A co-operative virtual reality game, an easily trainable neural network, and multiplayer Quick, Draw! — come take a peek at some of the experiments on display at Google I/O 2017.

What’s the best way to show off Google’s various underlying technologies? Make them fun to use!

Every year at I/O, Google dedicates a portion of the show floor to experiments submitted by various developers from around the world. I managed to find some time to try a majority of them in the Experiments tent. Here are a few highlights, as well as links to where you can try out the Experiments for yourself.

OSP

Train your own neural network with this easy-to-program experiment.

Train your own neural network with this experiment, which pairs machine learning with a Raspberry Pi. The wooden box shown here is actually a smart plug with a camera attached, and you can train it to control whatever is plugged into it. If that’s a lightbulb, for instance, you can record gestures with the external camera that turn the light on and off. The MacBook Pro shown here was merely there to demonstrate how quick the training is!

For more on this experiment, check it out here.

Konterball

Play Ping-Pong with a friend through the Chrome browser with Konterball.

All I’ve ever wanted to do in life is play Ping-Pong without the danger of losing the little plastic ball. Playing it in virtual reality is the next best thing. Konterball is a VR-enabled mini-game that lets you play Ping-Pong against a virtual wall, or against a friend with another VR-enabled device. Best of all, this particular experiment showcases Chrome’s new WebVR capabilities, which were officially announced this year at Google I/O.

For more on this experiment, check it out here.

Quick, Draw!

Draw — quickly! Or you’ll lose the game.

Have you had a chance to play with AutoDraw? This experiment uses machine learning to identify what you’re drawing on the screen. In this particular execution, you can play a multiplayer Pictionary-like game called Quick, Draw! with up to three people. Take heed: you really do have to draw quickly, or your score will suffer.

For more on this experiment, check it out here.

Giorgio Cam

Easily make music by taking pictures of objects.

Built with machine learning, this experiment enables you to create a song simply by snapping a photo of an object. The app then uses image recognition to label what it sees, all while jamming out to a beat-laden Giorgio Moroder song in the background (hence the name Giorgio Cam). Giorgio Cam will give you lyrics, too, and it’s surprisingly not that bad at piecing words together into thoughtful prose.

For more on this experiment, check it out here.

The Spirit

This gorgeous experiment was hard to snap a photo of, but it uses an Intel Edison board and Firebase to power it.

Unfortunate glare hitting the Experiments dome made this particular experiment a bit hard to take in, which is a bummer, because it kept me from seeing the full spectrum of color unfolding on screen. The Spirit enables you to use noise derivatives and curl noise to create weird, smoky shapes (think the smoke monster from LOST). It’s powered by an Intel Edison board running Android Things and does a majority of its processing in the cloud via Firebase.
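The core idea behind curl noise is simple: take the perpendicular gradient of a scalar noise field, and the resulting velocity field is divergence-free, which is what makes particles advected through it swirl like smoke. Here’s a minimal 2D sketch of that idea; the `potential` function is just a stand-in sum of sines (the real experiment presumably uses Perlin or simplex noise), and all names here are illustrative, not from The Spirit’s actual code.

```python
import math

def potential(x, y):
    # Stand-in scalar noise field (a sum of sines). A real curl-noise
    # system would use Perlin or simplex noise here instead.
    return math.sin(1.7 * x + 0.3) * math.cos(2.1 * y - 0.8) + 0.5 * math.sin(3.3 * x * y)

def curl_velocity(x, y, eps=1e-4):
    # 2D curl noise: the velocity is the perpendicular gradient of the
    # scalar potential, estimated with central finite differences.
    # This field is divergence-free, so advected particles swirl
    # without bunching up -- the smoky look.
    dpsi_dx = (potential(x + eps, y) - potential(x - eps, y)) / (2 * eps)
    dpsi_dy = (potential(x, y + eps) - potential(x, y - eps)) / (2 * eps)
    return (dpsi_dy, -dpsi_dx)

# Advect one particle through the field with simple Euler steps.
px, py, dt = 0.1, 0.2, 0.01
for _ in range(100):
    vx, vy = curl_velocity(px, py)
    px, py = px + vx * dt, py + vy * dt
```

Render thousands of such particles with fading trails and you get the flowing, smoke-like forms the installation displays.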

For more on the experiment, check it out here.

Spot the Bot

I feel really bad — I could hardly manage to help this man effectively spot the robot.

Looking for your next cooperative game? Spot the Bot is fairly simple in terms of game mechanics, but I can see it immediately becoming a popular party game for anyone who has a Daydream View in the house.

Spot the Bot requires a friend with another device to help you spot the robot by describing its features and characteristics, but the onus is on you to locate it in the virtual world. I imagine this game is especially fun now that you can stream your Daydream VR experience to the big screen.

For more on the experiment, check it out here.
