Experiment

bit.evolution, ‘Hatch Your Dinosaur!’, is an activity for children that focuses on creative play: play driven by creativity and imagination.

It was first launched at Maker Faire Bangkok 2018, and is currently being developed further for a future bit.studio project (stay tuned!).

HOW IT WORKS

Each child is given a sphere-shaped dinosaur egg to draw and color their imagination on. After drawing, the egg is scanned by multiple cameras, which capture the pattern on the sphere and wrap it onto a 3D dinosaur model. The dinosaur models include Tyrannosaurus, Pteranodon, Apatosaurus, and Triceratops.
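
To give a concrete sense of the scanning step, here is a minimal sketch of unwrapping one camera view of the egg into an equirectangular texture that a 3D model can use. It assumes a single orthographic camera and a pre-detected sphere outline; the real rig stitches views from multiple cameras.

```python
import numpy as np
import cv2

def hemisphere_to_equirect(img, cx, cy, r, out_w=512, out_h=256):
    """Unwrap the visible hemisphere of a sphere (centre (cx, cy),
    pixel radius r in `img`) into the left half of an equirectangular
    texture, assuming an orthographic camera on the +Z axis."""
    tex = np.zeros((out_h, out_w, 3), np.uint8)
    for v in range(out_h):
        lat = (v / out_h - 0.5) * np.pi            # latitude: -pi/2 .. pi/2
        for u in range(out_w // 2):                # front hemisphere only
            lon = (u / out_w - 0.25) * 2 * np.pi   # longitude: -pi/2 .. pi/2
            x = np.cos(lat) * np.sin(lon)          # point on the unit sphere
            y = np.sin(lat)
            px, py = int(cx + r * x), int(cy - r * y)  # orthographic projection
            if 0 <= px < img.shape[1] and 0 <= py < img.shape[0]:
                tex[v, u] = img[py, px]
    return tex

# Example: tex = hemisphere_to_equirect(cv2.imread("egg.jpg"), 320, 240, 200)
```

A second camera facing the back of the egg would fill the right half of the texture the same way.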

The eggs that appear on the screen hatch into baby dinosaurs and grow into adult dinosaurs over time. This provides children with an element of surprise!

Moreover, children can give their dinosaurs personalities by recording their facial expressions and voices, using facial motion tracking and voice alteration technology: the bigger the dinosaur, the deeper the voice.
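
As an illustration of that size-to-voice rule, the sketch below deepens a recorded voice by pitch-shifting it in proportion to the dinosaur's growth. The use of librosa and the -8 semitone ceiling are our assumptions for illustration, not bit.studio's actual pipeline.

```python
import librosa
import soundfile as sf

def dino_voice(in_wav, out_wav, size):
    """Deepen a recorded voice in proportion to dinosaur size.
    `size` in [0, 1]: 0 = hatchling (unchanged), 1 = full adult.
    The -8 semitone ceiling is an assumed tuning."""
    y, sr = librosa.load(in_wav, sr=None)
    n_steps = -8.0 * size                     # bigger dino -> lower pitch
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
    sf.write(out_wav, shifted, sr)
```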

The experience allows children to interact with technology in a more creative manner: not only consuming content, but creating it as well.

Is it possible for machines to feel emotions? As far as we know, machines have no emotions. But if we humans teach them, will they be able to learn? (A)I FEEL is a project dedicated to finding answers to these questions by creating a teaching and learning process between humans and a machine. To teach the machine, it asks each user to draw a picture that represents a specific emotion. The picture is recognized and then memorized as one representative of that emotion. Then, to demonstrate what the machine has learned, the user can draw a picture portraying their emotion at that moment. The machine interprets the user's emotion and colors the picture accordingly. The whole process is visualized as an interactive installation that encourages people to participate in the project.
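
Under the hood, a teach-then-interpret loop like this can be implemented very simply. The sketch below stores each labelled drawing as a feature vector and classifies new drawings by nearest neighbour; the feature choice, palette, and classifier are our illustrative assumptions, not the project's actual model.

```python
import numpy as np
import cv2

class EmotionMemory:
    """Minimal teach/interpret loop: remember feature vectors of drawings
    labelled with an emotion, then classify a new drawing by nearest
    neighbour and tint it with that emotion's colour."""

    PALETTE = {"joy": (0, 220, 255), "anger": (0, 0, 255),
               "sadness": (255, 80, 0)}            # BGR; assumed colours

    def __init__(self):
        self.samples = []                          # (feature, emotion) pairs

    @staticmethod
    def features(img):
        g = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        return cv2.resize(g, (16, 16)).flatten().astype(np.float32) / 255.0

    def teach(self, img, emotion):
        self.samples.append((self.features(img), emotion))

    def interpret(self, img):
        f = self.features(img)
        dists = [np.linalg.norm(f - s) for s, _ in self.samples]
        emotion = self.samples[int(np.argmin(dists))][1]
        tint = np.zeros_like(img)
        tint[:] = self.PALETTE[emotion]
        return emotion, cv2.addWeighted(img, 0.6, tint, 0.4, 0)
```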

We present a shadow play system that displays, in real time, extrapolated motion sequences for the still shadow images cast by players. The crux of our system is a sequential version of the recently developed generative adversarial network, with a stack of LSTMs as the sequence controller. We train the model on preprocessed image sequences of diverse shadow figures' motions, and use it in our enhanced shadow play system to automatically animate the shadows as if they were alive.
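
A condensed sketch of such a generator in PyTorch might look like the following: a CNN encodes the still shadow, a stack of LSTMs rolls the latent state forward in time, and a decoder renders each step as a frame. All sizes are illustrative, and the adversarial discriminator and training loop are omitted.

```python
import torch
import torch.nn as nn

class ShadowAnimator(nn.Module):
    """Generator sketch: encode a still shadow, advance the latent with
    stacked LSTMs, and decode each timestep into an animation frame."""

    def __init__(self, z=128, steps=16):
        super().__init__()
        self.steps = steps
        self.enc = nn.Sequential(                 # 64x64 shadow -> latent z
            nn.Conv2d(1, 32, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, 2, 1), nn.ReLU(),
            nn.Flatten(), nn.Linear(64 * 16 * 16, z))
        self.lstm = nn.LSTM(z, z, num_layers=3)   # the stacked controller
        self.dec = nn.Sequential(
            nn.Linear(z, 64 * 16 * 16), nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Sigmoid())

    def forward(self, still):                     # still: (B, 1, 64, 64)
        z0 = self.enc(still)                      # (B, z)
        seq = z0.unsqueeze(0).repeat(self.steps, 1, 1)
        hidden, _ = self.lstm(seq)                # (steps, B, z)
        return torch.stack([self.dec(h) for h in hidden])
```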

bit.bot is our latest experiment, built by our interns, to create an adorable robot that can assist humans with various tasks such as drawing, cooking, or searching for videos online. bit.bot is built on a Raspberry Pi equipped with a tiny screen that lets it exhibit various facial expressions, and a pico projector to display videos to humans.
After saying the hotword “bitbot”, a human can make queries, which are then passed through the AI system. bit.bot responds with a facial expression on the tiny screen and projects the resulting videos accordingly.
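
The interaction loop reduces to wake word, query, respond. The sketch below mirrors that flow; every helper here is a hypothetical stand-in for bit.bot's real microphone, AI backend, screen, and projector modules.

```python
import random
import time

# Hypothetical stand-ins for bit.bot's real hardware and AI modules.
def heard_hotword():                    # microphone + "bitbot" detector
    return random.random() < 0.1

def capture_query():                    # speech-to-text of the user's request
    return "draw a cat"

def ask_ai(query):                      # query -> (expression, video URL)
    return "happy", "https://example.com/cat-drawing.mp4"

def show_expression(face):              # tiny screen on the Raspberry Pi
    print(f"[screen] {face}")

def project_video(url):                 # pico projector
    print(f"[projector] playing {url}")

def main_loop():
    """Wake word -> query -> AI -> expression + projected video."""
    while True:
        if heard_hotword():
            show_expression("listening")
            face, video = ask_ai(capture_query())
            show_expression(face)
            project_video(video)
        time.sleep(0.1)
```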

Unified Reality is a unified framework for virtual reality, augmented reality, and projection mapping developed by Bit Studio. By connecting the virtual, augmented, and physical worlds together, we introduce a shared experience platform that allows these different technologies to interact with one another.

Virtual Reality (VR) is a technology that immerses humans in a virtual world. For decades, VR devices have been rolled out to the market and have brought joy to people with their ability to take them anywhere they want. At present, VR has proven to be the best choice for such immersive experiences; however, limited modes of user interaction have always stood in the way of deeper immersion. To make the experience even better, we knew we needed to do something. This is where the story began.

At Bit Studio, various VR-related research projects have been carried out to improve user experiences and push the technology to its limits. Designed to work with the HTC Vive® VR headset, bit.lighthouse is one such project. We aim to produce a tool that enhances the immersion of VR by bringing rich user interaction into the virtual world.

Among the flagship VR headsets on the market, the HTC Vive® has a killer feature called “Room Scale Tracking”, made possible by a technology named Lighthouse. Lighthouse allows a VR user to move freely within a cuboid space. Built by Vive's engineers, it works by sweeping light beams across the entire room to aid the position tracking of the headset, hence the name Lighthouse. The position and pose of the headset are computed from the timing of those light beams. What's surprising is that HTC allows developers to utilize this technology for free!
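
For the curious, the core of that timing calculation is short. Each rotor spins at a fixed rate (about 60 Hz on the original Lighthouse), so the delay between the sync flash and the laser hitting a sensor maps linearly to an angle; the tick rate below is an assumed sensor-clock value.

```python
import math

SWEEP_HZ = 60.0                  # Lighthouse v1 rotors spin at ~60 Hz
TICKS_PER_SEC = 48_000_000       # assumed sensor clock; hardware-dependent

def sweep_angle(t_sync_ticks, t_hit_ticks):
    """Angle of the laser plane (radians from sweep start) when it hit
    the sensor. Inputs are raw timestamps in sensor-clock ticks."""
    dt = (t_hit_ticks - t_sync_ticks) / TICKS_PER_SEC
    return 2.0 * math.pi * SWEEP_HZ * dt
```

Each base station alternates a horizontal and a vertical sweep, so two such angles together define a ray from the station to each sensor.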

The bit.lighthouse project began near the end of 2016, before the Vive Tracker® was announced. At the time, many developers had been hacking Lighthouse's protocol and publishing their work to the open-source community. Wanting to make use of Lighthouse ourselves, we joined the community and put our effort into researching the technology. We then designed hardware to fit our needs and ended up with our 1st prototype.

The 1st prototype was a floppy piece of hardware tangled with sensors and wires. A PU foam board was used as a mount for light sensors installed face-up on the board. At this stage, we used a LAN cable to feed all the collected data to a computer for further processing. Our software engineer built an algorithm to decode the data into something useful, i.e. pose and position, with the help of the well-known open-source computer vision library OpenCV. The processed data were then displayed in a 3D environment built with the Unreal® engine.
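
The pose recovery itself can be phrased as a classic Perspective-n-Point problem, which is presumably why OpenCV fit so well. A minimal sketch, treating the base station as a pinhole camera whose "pixels" are the tangents of the sweep angles (our simplification, not the exact production code):

```python
import numpy as np
import cv2

def tracker_pose(sensor_xyz, azimuths, elevations):
    """Recover the board's pose from one base station's sweep angles.
    `sensor_xyz`: Nx3 sensor positions on the board (its CAD layout).
    `azimuths`/`elevations`: per-sensor sweep angles in radians.
    At least four lit sensors are needed per update."""
    obj = np.asarray(sensor_xyz, np.float64)
    img = np.column_stack([np.tan(azimuths),
                           np.tan(elevations)]).astype(np.float64)
    K = np.eye(3)                      # angles are already normalised coords
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, None)
    return ok, cv2.Rodrigues(rvec)[0], tvec   # rotation matrix + translation
```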

After countless trials and errors on the 1st prototype, we were ready to move on and build more rigid hardware. In our 2nd prototype, we designed a custom PCB that holds all the electronic modules in a more compact space. Powered by an ARM Cortex processor, we achieved centimeter-level precision using only the data available from the Lighthouse. A rechargeable battery was also added to the module, and we started streaming the collected data over Wi-Fi instead of a wired connection for better mobility.
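
The Wi-Fi link can be as simple as fire-and-forget UDP datagrams, since a stale pose packet is better dropped than replayed. The address, packet layout, and 120 Hz rate below are our assumptions for illustration.

```python
import socket
import struct
import time

HOST_IP, PORT = "192.168.1.50", 9000   # assumed receiver address

def stream_pose(get_pose):
    """Push timestamped pose packets over UDP, mirroring the 2nd
    prototype's move from a LAN cable to Wi-Fi. Packet layout is
    little-endian: timestamp + xyz + quaternion (assumed)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        x, y, z, qw, qx, qy, qz = get_pose()
        pkt = struct.pack("<d7f", time.time(), x, y, z, qw, qx, qy, qz)
        sock.sendto(pkt, (HOST_IP, PORT))
        time.sleep(1 / 120)            # assumed 120 Hz update rate
```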

We were very proud of this project, right up until the announcement of the Vive Tracker®. And so the story goes…

When we were young, playing with shadows could take us far beyond what we saw. The shadow puppets in our imagination could move and travel to any place. We aim to recreate and enhance this vivid experience with a mixture of art and technology, turning the shadows cast by players' hands into living animals that fly into the sky, shining bright as they turn into constellations.

Inspired by “The Three Little Pigs”, a famous fairy tale from our childhood, this project lets each player take the role of a piglet in the story. Wearing the same top as the player and carrying the player's name above its head, the piglet starts a journey from its mother's home into the village. Unbeknownst to it, a big bad wolf lingers in the nearby forest, waiting for a chance to catch its prey. It is up to the player to keep the piglet from becoming the wolf's meal!