I am not involved in this project and I can’t offer a personal endorsement of their effort. Still, I found it interesting, and because it’s the kind of project Metaversing readers tend to be interested in, I wanted to share it with you.

Technically, this isn’t the first Metaverse project to hit Kickstarter. That title probably belongs to either Surreal Adventures or the VR sandbox MMO Voxelnauts. But this appears to be the first one that puts the Metaverse aspect of the project front and center rather than treating it as a side consideration.

The traditional sales approach for a Metaverse project would show us a unique virtual world and sell backers on the sizzle of unique avatars, interesting environments, and a massively shared virtual environment. That isn’t what is happening here. They’re attempting to engage us… intellectually. Strange, isn’t it?

They start with their definition of a Metaverse, built upon the pillars of realism, ubiquity, interoperability, and scalability. They identify four major obstacles: monetization, proprietary elements, lack of critical mass, and a premature focus on realism.

Their primary aim is to overcome the first two obstacles: monetization and a proprietary platform. They propose a three-year project in which a core development team works with backers to create a final product that would be released to the public as open source.

Various levels of participation by the backers would result in increasing levels of influence in the project as well as rewards, such as a five year contract for land in the residential and commercial areas.

The part of their presentation that has generated the most interest begins at the 8:11 mark. They demonstrate a live portal inside the world of Minecraft which allows one to peer live into the world of Doom (and presumably, someone in the world of Doom could peer in the reverse direction into Minecraft). One might join their counterpart on the other side simply by crossing the threshold. That appears to be the level of interoperability this project hopes to enable.

The project itself is light on details, and I would like to have seen more. Part of this may be because the goal of the project is precisely to flesh out those details and implement them. Another reason may be the real threat of Brain Rape, a process in which a venture capitalist or rival developer seeks more disclosure only to use that knowledge for themselves.

You can find out more about the project by visiting it on Kickstarter, and if you have specific questions for the project’s creator, you can always use the Contact Me link on the project’s main page.

I’m happy to see someone going in a different direction and breaking away from the conventions of a typical Metaverse project.

A Review of Earlier Articles… and a Return to Metaverse Issues
https://metaversing.com/2015/04/20/a-review-of-earlier-articles-and-a-return-to-metaverse-issues/
Mon, 20 Apr 2015 09:28:39 +0000

Nine months ago, I wrote my last article on the Metaverse.

It was a short piece, mostly referencing an email from Fabian Giesen, a demoscene coder (and more) who was doing some VR work at Valve as a contractor. I’ll be honest, his message was a real downer for me, and I had my own Notch moment. Why was I working towards something that, if successful, would ultimately be used just to provide value to Facebook?

Over the past nine months, a surprising number of you have told me how those early Metaverse articles had actually been very helpful to you. A few of you said that you had a Metaverse effort going, but most of you were creating multiplayer virtual environments. Thank you all for your feedback and support!

I think the moment that it all crystallized and brought me back to Metaversing was seeing the return of Valve with the HTC Vive. Suddenly, it seemed like there were possibilities once again. Thanks, Gabe. I’m looking forward to learning more about your shared entertainment universe… perhaps a non-traditional Metaverse?

As I look back over last year’s body of work, I think most of the pieces have held up well enough. Perhaps the most controversial article was the one on the Virtual Home. The name alone drew an immediate comparison to PlayStation Home (closed in March 2015), which turns out to be wildly unpopular with VR enthusiasts as the basis for a Metaverse implementation.

PlayStation Home was not where I was heading, so I can agree with much of the upset. Still, the article itself was far too ambitious. I tried to compress way too many ideas into too little space. I’ve learned my lesson — I’ll try to keep future articles more contained.

The PlayStation Home, now abandoned by Sony

What many of you may not have realized was that most of the articles from last year formed the discrete parts of a global design for a Metaverse. That Metaverse, ultimately, was never described in its entirety. I still have what appears to be a unique blueprint for a Metaverse that I hope to describe in detail. I’m convinced that this model is not only viable (from multiple vantage points), but that it also has the ability to become wildly successful.

This year I intend to return to my work of laying down more of the design elements and then finally tying it all together. For now, I’ve got to see what happened to some illustrative artwork that was commissioned last year in JanusVR in support of an article I never published. It seems that some of the recent work by Valve (and now Oculus) has made that topic extremely relevant…

Valve’s Lighthouse as USB: Anything More than a Bunch of Spin?
https://metaversing.com/2015/03/25/valves-lighthouse-as-usb-anything-more-than-a-bunch-of-spin/
Wed, 25 Mar 2015 13:53:20 +0000

This is the third article in a series on the Valve/HTC Vive Ecosystem. If you need additional context, please begin with the first article in the series.

Introduction

A famous quote from Gabe Newell concerns a lesson that Valve learned early on when dealing with the Internet. You can find it in Episode 306 of the Nerdist Podcast at 00:12:14.

Don’t ever, ever try to lie to the Internet because they will catch you. They will deconstruct your spin. They will remember everything you ever say for eternity. -Gabe Newell

At this year’s Game Developers Conference, where Valve announced its Virtual Reality partnership with HTC, Gabe made an incredible claim about the Lighthouse tracking technology:

So we’re gonna just give that away. What we want is for that to be like USB. It’s not some special secret sauce. It’s like everybody in the PC community will benefit if there’s this useful technology out there. -Gabe Newell (Valve)

The story which accompanies the interview describes Lighthouse as a way of providing infinite input solutions into Virtual Reality. “As long as tracking is there, anything can be brought into VR, like how USB ports enable you to plug (virtually) anything into your computer.”

What the Technology Brings

In the previous two articles, we’ve dug into the technology itself, and it supports what we’ve been told. Spend perhaps $100-150 for two of Valve’s Lighthouse units and mount them in opposite corners of the room. At that point, you can almost forget about them. But any enabled device that you bring into the room can take advantage of:

Rock-solid positional data with high precision and resolution

Rock-solid orientation data with high precision and resolution

Very low additional power use (passive sensors, undemanding electronics)

This support would be available for an arbitrary number of devices, and “at a low enough cost to be incorporated into consumer electronics items such as televisions, headsets, input devices, or mobile devices.”

Given Valve’s ambitions for the technology, it is expected that they will create a complete solution that feeds fully resolved position and orientation data to an electronic device without the need for additional processing.

That last bit of functionality has yet to be confirmed. Even if it isn’t the case, the processing power required to compute position and orientation is extremely lightweight. Valve may also have an additional solution for wireless connectivity back to a PC.
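To make the “lightweight processing” claim concrete, here is a minimal sketch of how a sensor might convert sweep timings into a direction ray. Valve has not published the actual math, so the 60 Hz rotor rate, the angle conventions, and the function names here are all assumptions for illustration only.

```python
import math

ROTOR_HZ = 60.0  # assumed rotor spin rate; the real figure may differ


def sweep_angle(dt_seconds):
    """Convert time elapsed since the sync flash into a sweep angle (radians)."""
    return 2.0 * math.pi * ROTOR_HZ * dt_seconds


def ray_from_sweeps(dt_horizontal, dt_vertical):
    """Combine horizontal and vertical sweep hit times into a unit direction
    ray in the base station's frame. Centering the sweep at 90 degrees is an
    assumption; the real Lighthouse geometry differs in detail."""
    az = sweep_angle(dt_horizontal) - math.pi / 2.0
    el = sweep_angle(dt_vertical) - math.pi / 2.0
    # Simple pinhole-style conversion from two angles to a direction vector.
    x = math.tan(az)
    y = math.tan(el)
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)
```

A hit exactly mid-sweep on both axes (1/240 s after each sync flash at 60 Hz) yields a ray pointing straight out of the base station, which is the kind of trivially cheap arithmetic a microcontroller could do per pulse.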

It is unclear if the default Lighthouse mode will support any identity features, but our review seems to suggest that it would be easy for Valve to enable the following functionality with a user-installed firmware update:

Ability to instantly identify a room and to distinguish it from others

Ability to give the room a unique identity to be used as a database key

More on the significance of these later in this article.

It is important to note that while this technology seems quite promising, it is still being developed. An early developer release is expected in the spring, and consumer release is slated for November of this year.

Commonly Suggested Uses

To be honest, the apparent uses (provided by Valve and speculated by third parties) are quite plausible, but by themselves don’t seem especially compelling:

Ability to find real-world objects in the room while you are still in VR

Solving robotic navigational issues

Now that we have finished our technical review in the previous two articles and have a better idea about the system and its capabilities, why don’t we try our own hand at developing some new features which can take advantage of it?

Room Scanning

If this isn’t an upcoming feature for the HTC Vive, even for novelty’s sake, then the obvious has been missed. The concept of creating a depth map from just two images is very well known.

What would make the process even more robust is combining a camera of well known characteristics with the precision of Lighthouse tracking (providing known position and aim at all times). If not with a unique device built especially for that purpose, then we’re talking about the HTC Vive itself with built-in camera and tracking.

How might it work? It couldn’t be simpler: walk around the room and look at everything. The software merges image stills or video with high-resolution position and orientation data for the camera. Once that is complete, it processes the images, determines the depth of elements that have been seen from multiple angles, reconstructs the entire scene in three dimensions, and displays it in virtual reality.
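As a rough illustration of the reconstruction step, here is a minimal linear-triangulation sketch in Python with NumPy: given two camera projection matrices known from tracking, a feature matched in both images can be lifted back to a 3D point. The matrices and coordinates below are placeholders of my own, not anything from Valve’s pipeline.

```python
import numpy as np


def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen from two known poses.
    P1, P2: 3x4 camera projection matrices (known from Lighthouse tracking).
    x1, x2: normalized image coordinates (u, v) of the same feature."""
    # Each view contributes two linear constraints: u*(P row 3) - (P row 1) = 0, etc.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean
```

With one camera at the origin and a second displaced a metre to the side (both poses supplied by tracking), the function recovers the original 3D point from its two projections; repeating this over thousands of matched features is the core of the scene reconstruction described above.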

Worth noting, the internal development version of the HTC Vive appears to have two cameras on the front. One cannot help but wonder if they contemplated yet another method of 3D image acquisition, perhaps one more appropriate for real-time processing.

Room scanning is something that might play well with Valve’s announced room-scaled VR, where you actually move around the physical room in tandem with your character moving in virtual reality. If you’re going to move around your living room, why not use it as the location for a virtual world at the same time? (Give some thought to how that might work. We’ll circle back around to it later in this article.)

What else might room scanning open the door for? Social engagements and playing games with friends and family in a familiar environment. It could serve as a wonderful bridge between virtual reality and augmented reality.

Object Scanning

This is similar to room scanning, but you would indicate to the software a specific item in the room. You would get up close to the item and slowly look all around it while the software reconstructs it before you in real-time. The software could automatically determine any holes in the model and prompt you as needed to inspect specific areas in more detail (or from other angles) to get a more complete picture.

Yet another version might take advantage of a special mode which could be made available in the Lighthouse system. While the first Lighthouse unit provides high-resolution tracking for your head-mounted display or camera, the second could temporarily enter an alternate mode in which a carefully strobed and swept infrared laser assists the camera in constructing a high-definition model of your object.

Once created, your object could be imported into a virtual library which you could share with others.

Augmented Reality

We touched on this briefly when covering room scanning, but this topic deserves serious consideration by itself. What if it was as simple as walking into a room with a Lighthouse enabled webcam, putting on your Lighthouse-enabled Augmented Reality glasses, and having a conversation with your aunt who is sitting on both your couch and her couch from 200 miles away?

Maybe you are like me and you never liked what you saw with augmented reality. So many startups are quick to promise yet unable to deliver these pie-in-the-sky aspirational tech demos which are little more than ridiculous techno-fantasies.

There is no way these things could even do the required computer-vision based processing to constantly track the images with the user’s changing head movement, not to mention have any idea where to place objects in the room or how to share the same content with others in the room.

Or is there?

The curious thing is that the Valve Lighthouse solves quite a number of augmented reality problems. Tracking directly solves the viewpoint problem, but what about places to project content or knowing who to share data with? That would be tied to the room identity features mentioned earlier.

Lighthouse-enabled AR glasses would be able to instantly identify the room they are in and distinguish it from others. The next time you or someone else walks into the room, any special information (such as pre-defined areas to project images onto) is referenced and downloaded based on the Lighthouse ID number. When Lighthouse-assisted, your AR device can focus more of its limited resources on communications, content, and graphics.
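A minimal sketch of what that lookup might look like. The registry, the ID format, and the content fields here are entirely hypothetical; the point is simply that a stable room ID makes a database key.

```python
# Hypothetical content registry keyed by a Lighthouse room ID.
# In practice this would live in an online database rather than a dict.
ROOM_CONTENT = {
    "LH-4F2A": {
        "surfaces": ["north_wall", "coffee_table"],  # pre-defined projection areas
        "share_group": "family",                     # who sees the same content
    },
}


def load_room_scene(lighthouse_id):
    """Return any pre-defined AR content for this room, or an empty scene
    when the room has never been configured."""
    return ROOM_CONTENT.get(lighthouse_id, {"surfaces": [], "share_group": None})
```

Because the key never changes for a given room, any pair of glasses that hears the same beacon resolves to the same surfaces and the same share group, which is exactly the shared-content problem described above.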

Take another look at one of those aspirational augmented-reality videos from earlier this year and imagine a Lighthouse in every room. Now that you know more about Lighthouse, doesn’t this look less aspirational and more like a blueprint for something that could be available next year?

Here’s the funny thing: CastAR was founded by two ex-Valve employees who did not want to make the transition from Augmented Reality to Virtual Reality. Valve let them go, but also let them take their AR technology with them. It might be a good time for someone to ask Jeri Ellsworth and Rick Johnson about Lighthouse.

Commercial Lighthouse Units and Augmented Reality

After making the connection between Lighthouse technology and Augmented Reality, I started to wonder how it would work in the commercial space. I’m not much of a creative type, so I’m going to play this one straight.

As you enter the front door, your Lighthouse-enabled glasses automatically pick up the ID beacon from an in-store Lighthouse unit. You have AR Beacon Roaming enabled, so your glasses look up the beacon’s unique ID in an online database and determine that the available scene is compatible with your hardware and consistent with your filter settings. The scene is tied to a specific location in the store.

Curious, you walk over to the indicated area, and give your glasses permission to download and execute the scene over your wireless connection. Within moments, a lifelike, distinguished, tall man with white hair in a gray suit appears in your field of view. He addresses you from the speakers built in the end pieces of your glasses.

Okay, let’s stop there. I’m not going to blow any more of this article’s word budget on this particular scenario, and I think you might have some idea where it can go from there. Yes, such an experience could not only be interactive, but it could also independently complete a transaction with the user.

Lighthouse can mean the ability to authoritatively signal the availability of pre-defined content that is tied to location, and to enable augmented-reality glasses to better take advantage of it (by providing stable tracking that would far exceed what smart glasses might be able to do on their own).

Can you imagine some other uses? Museums, bakeries, real estate, self-service kiosks? Creative technical types might operate a public sandbox for like-minded individuals to come and show off their latest efforts in front of a live audience.

Perhaps this is a world that Valve explored and decided that it was best to leave this to others?

Other Potential Uses

Home automation (visualizing the state of your home and making changes) could benefit greatly from Lighthouse-enabled room scanning or augmented reality.

Devices could be created for the blind which allow them to see objects in a room using depth scanning (and if combined with Lighthouse identity, features and functions of the room could be indexed and tagged in remote databases).

Small sets of freestanding Lighthouse-enabled cameras with network connections could become popular. Two or more in the same room could be used to create movies where the scene can be reconstructed from many different arbitrary angles. With the right processing, an entire room or stage could be broadcast in virtual reality in real time: streaming performances.

What about using an enthusiast level PC to deliver next-generation augmented reality features in the home or office, with today’s technology? This might deserve an article in its own right, so the description here is going to be brief.

Combine the augmented reality features made available with Lighthouse (such as room identity and presets), PC-based room scanning and depth-mapping, PC-based processing and graphics power, the Vive head-mounted display, and the idea behind one pre-existing Jeri Ellsworth patent assigned to Valve which includes re-rendering a live camera feed with the same perspective as the human eye would see.

What do we have? Just as mobile augmented reality and Lighthouse made the CastAR video look possible, a PC-driven augmented reality system and Lighthouse could make last week’s fantasy “Just another day in the office” high concept demo look like a blueprint for next year’s technology.

The number of different things, both big and small, which Lighthouse enables is staggering. What are some of the uses that you can think of for Lighthouse?

Lighthouse in the Storm. Image Source: wallpaper-kid

Summary

So you run into a case where there’s something we think is really important, it is abstract, but something we think is really important and we want to push in that direction. The reason why fans haven’t arrived at the same conclusions is because they don’t have the same data as us. –Erik Johnson (Valve) [55:50]

When Gabe Newell looks at virtual reality, he asks how long it will be stable. How long until a VR display is replaced by direct neural stimulation? “You just want to test to make sure that you’re not investing in something that’s fragile.” –Geoff Keighley interview with Gabe Newell (00:47:48).

When I look at Lighthouse, it is anything but fragile. It solves core issues in Virtual Reality with inputs and tracking and does not seem easily replaced. What I find surprising is that it seems to have solid practical applications that match with Valve’s core mission as much as it has additional applications that go well beyond anything that Valve seems to be interested in.

Is this another USB, a common standard that is picked up and used across the industry? It sure is starting to look that way. If Valve is offering to license the technology for free, there is a lot of promise in this new enabling technology.

Development on this product still needs to continue (as planned), but from all appearances, Lighthouse’s potential as a common technology is a claim that passes the spin test.

March 28th, 2015 – For the next few weeks, Alan Yates of Valve is taking questions on Lighthouse technology.

Edit 3/25/2015 – Corrected a doubled word and also the link for depth mapping. Thanks /u/Boffster.

The Future of Vehicular Safety Communications Systems
https://metaversing.com/2014/04/19/the-future-of-vehicular-safety-communications-systems/
Sat, 19 Apr 2014 22:09:45 +0000

This article is going to stray a little off-topic. Before the end, I’m going to bring it back to the topic of augmented reality and virtual reality. I hope you find it interesting.

As mentioned in my previous article, I’m taking some graduate level computer security courses as part of my continuing education. Right now, I’m taking a quick break from my project which investigates security vulnerabilities in next generation vehicular safety systems. These systems, under development today, have been given the green light by the National Highway Traffic Safety Administration.

If things go as planned, your vehicle will exchange reports on position, speed, heading (and more) with surrounding vehicles. The road may transmit information to you, such as posted warning signs (curve ahead), weather conditions, or upcoming road hazards.

In the short term, this data will be used to give important information to drivers as they are going down the road. (These systems might warn you that someone is running a stop sign at the intersection in front of you, or that a vehicle two cars ahead of you has slammed on its brakes.) In the long term, this sets us on the path toward self-driving vehicles.

So what does this have to do with Virtual Reality or Augmented Reality? When you dive down into the specifications for this new automotive network, things start to look really cool. Take a look at this message specification.

Image Source: Dedicated Short Range Communications (DSRC) Message Set Dictionary Support Page at SAE International

That’s a descriptor for a road sign. It is giving us the position of the sign, the direction that the sign is facing, and the Federal Highway Administration’s code for what type of sign it is. Pretty cool, right?

What it tells us about the surrounding vehicles is even cooler: position, speed, heading, vehicle size, and much more. It even tells us if a vehicle is an emergency vehicle with its emergency lights on. All of this is refreshed up to ten times per second. You can find more specifications at SAE International’s website.

In addition to vehicles, signs, and weather, the network also transmits data on roads, lanes, and pedestrians. What can we do with all of this information? We can create either a virtual reality or an augmented reality representation of the world around us. From there, we can create applications which exist outside of the vehicular ad-hoc networks. I’m quite sure we can build things beyond just real-time vehicular safety. Useful things? Entertaining things?
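As a rough sketch of how such a feed might be consumed, here is an illustrative message type and a mapping into a VR/AR scene entry. The field names, units, and structure are my own assumptions for illustration; they are not the actual SAE J2735 encoding, which defines its own binary message formats.

```python
from dataclasses import dataclass


@dataclass
class BasicSafetyMessage:
    """Illustrative stand-in for a vehicle safety report, refreshed up to
    ten times per second on the real network. Fields are assumptions."""
    vehicle_id: str
    latitude: float       # degrees
    longitude: float      # degrees
    speed_mps: float      # metres per second
    heading_deg: float    # clockwise from north
    is_emergency: bool = False


def to_scene_object(msg):
    """Map one incoming message onto a minimal entry for a VR or AR scene,
    e.g. highlighting emergency vehicles for the driver."""
    return {
        "id": msg.vehicle_id,
        "position": (msg.latitude, msg.longitude),
        "speed": msg.speed_mps,
        "heading": msg.heading_deg,
        "highlight": msg.is_emergency,
    }
```

Run over a stream of such messages, this kind of mapping is all it takes to keep a virtual or augmented view of the surrounding traffic current at the network’s 10 Hz refresh rate.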

One paper I found, Augmented Reality Driving Supported by Vehicular Ad Hoc Networking, explores the use of this data in Virtual Reality and Augmented Reality. It also presents the idea of using a virtual sound technology to make the driver more aware of their surroundings. This could be the start of some really interesting applications.

This is approximately where I was heading with a concept I briefly mentioned called the Real Home. The Real Home would be a representation of your actual home that works in either augmented reality or virtual reality. The idea is to merge real-world sensors and real-world controls into that representation, which could then be used for a number of different purposes.

I recently saw a TED Talk on how to fool a GPS receiver. The more interesting part of the presentation was the speaker’s prediction of low-cost, millimeter-accurate GPS “dots” that could be attached to objects in your home. You would be able to track household items in both virtual reality and augmented reality.

What would you do with a virtual reality representation of the road around you? A representation of your own home?

Perhaps you’re surrounded by virtual reality enthusiasts. If so, you’re one of the lucky ones. For many of us, there are very few people we can hold conversations with on the subject of virtual reality, let alone the Metaverse. It is hard to find inspiration in a vacuum.

If you find yourself looking for ideas, you might consider the wealth of old books that are out there. They’re mostly from the 1990s. Sure, they’re a bit dated, but that isn’t all bad. You may be able to look at old ideas with a fresh perspective. Some of the old ideas have gone unreviewed, and are waiting for easy solutions and new applications.

Most of the value derived from these old titles may not come from concepts directly pulled from the pages. They’re not going to say, “If you have a smartphone, then you might be able to use this technology and that.” Rather, they set the stage for you to think about similar problems in different ways.

Some months back, I filled a couple of shelves with virtual reality books for around $4-5 per title. Most of the cost was shipping. I located the majority of them on Amazon and eBay. If you find an interesting book in one source, try another — it may be cheaper. Also try searching for the title on Google and then clicking on the Shopping tab just under the search bar.

Here are a few more tips you may find helpful:

Use good search criteria. “virtual reality” seems to work the best. “cyberspace” titles are usually newer (2000+) and more often have to do with the topic of cybersecurity. “metaverse” will likely yield books about Second Life. “virtual worlds” can be mixed. If there are good titles with “avatar” in them, you’re going to have to comb through a lot of junk to find them. Be sure to limit your search to the book category.

Role-playing Games don’t appear to be a great source for ideas or implementations. You’d think they’d have some concrete ideas of how things might appear or be structured in the Metaverse. I went through a number of these, like GURPS Cyberpunk and Shadowrun titles. I really didn’t see much that was useful. In fact, these types of books probably aged far worse than anything else I acquired.

You probably want to avoid books that include a CD or are focused on coding (unless you’re writing a 3D engine from scratch). These will be badly dated. The only exception may be the titles on 3D positional audio, which have made little advance since the 1990s.