100 Days of VR: Day 2 Going through the Unity Space Shooter Tutorial I

We last left off going through the Unity Roll-a-Ball tutorial. I’ve learned a lot about how to code in Unity and how to use some of its existing APIs. Even more importantly, I learned how to navigate and use the Unity editor!

The editor is definitely different compared to web or mobile development, where you create everything via code, but it makes things a lot easier!

Today in Day 2, I started to look at the 2nd Unity lesson, the Unity Space Shooter. I’ve learned a lot about Unity from the previous tutorial, maybe even enough to start developing my own simple game (with a LOT of help from Stack Overflow), but instead I’ve decided to solidify my foundations by going through 1–2 more tutorials.

I didn’t get nearly as far as I did during Day 1 because of my job, but I did finish the first section:

Game setup, Player and Camera

Setting up the project

Unlike the previous tutorial where we just used simple materials and shapes, this time I had to use actual assets.

The first thing I had to do was download the assets that were provided with the tutorial.

You can grab them when you create a new project in Unity, or download them from the Unity Asset Store.

Then, when creating the new project, just make sure to add the asset package as a resource:

After setting up the Unity tutorial, I found out that you can adjust the screen resolution of your game by clicking on the Game tab and selecting one of the resolution options below it.

So far so good, nothing too complex yet!

The Player GameObject

Now that I’ve set up the project, I’ve moved on to creating the Player GameObject.

It’s nothing too different from what I learned in the previous tutorial, except this time, instead of creating a 3D sphere, I’m dragging in an existing model asset provided to me.

Cool ship!

Some interesting things to note from this video:

Mesh Collider, Mesh Filter, and Mesh Renderer

Looking at the components of the ship, we see that it has a Mesh Filter and a Mesh Renderer.

From a simple Google search:

Mesh Filter components take in a Mesh, which is a 3D object built out of triangles (triangles, because they’re the most efficient shape to process in graphics programming).

The Mesh Filter will then create the shape of what your GameObject will look like based off of the mesh that you provide it.

Afterwards, if you want to see your model, you’ll need a Mesh Renderer. This takes in your mesh and applies a material over it. The material is the skin of the ship you see above.

Finally, we have the Mesh Collider component that we add, which takes in a mesh. This allows us to create a collider that matches the exact shape of our ship.
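In the tutorial these three components are all added in the editor, but to make their roles concrete, here’s a minimal sketch of how the same wiring could be done from code. The field names and the idea of assigning everything at runtime are my own illustration, not something the tutorial does:

using UnityEngine;

public class ShipSetup : MonoBehaviour
{
    public Mesh shipMesh;         // the 3D model, made of triangles
    public Material shipMaterial; // the "skin" drawn over the mesh

    void Start()
    {
        // The Mesh Filter holds the shape of the GameObject.
        MeshFilter filter = gameObject.AddComponent<MeshFilter>();
        filter.mesh = shipMesh;

        // The Mesh Renderer draws that mesh using a material.
        MeshRenderer meshRenderer = gameObject.AddComponent<MeshRenderer>();
        meshRenderer.material = shipMaterial;

        // The Mesh Collider gives us collision matching the exact mesh shape.
        MeshCollider meshCollider = gameObject.AddComponent<MeshCollider>();
        meshCollider.sharedMesh = shipMesh;
    }
}

In practice you’d just drag the model into the scene and let Unity set these up, but this shows how the three components depend on the same mesh asset.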

It’s important to think about the gameplay vs. performance tradeoff that we have to make.

A more fine-grained collider means Unity does more work, which means slower performance. On the other hand, we might not want to just slap a cube collider over our ship.

If we did, we might register a collision near the front of the ship where there isn’t actually any mesh.

In a game that demands fine-grained control, like this space shooter, that would be bad.

Easy way to change your Game Transform values

Also, on another unrelated note, I found a nifty trick for the editor.

If you want to experiment with values, instead of typing them in manually, you can left-click on the properties circled in this picture

and then drag your mouse up and down to change the values. It’s much easier than typing them in by hand!

Camera and Lighting

In the next video, we played around a bit with the camera and lighting.

Camera

In the Camera component you can set the background. In our case, we just set a black background, making everything in the Game tab black except for our ship.

Lighting

In later Unity versions, the project already comes with a Directional Light in the hierarchy. These sources of light, as you’d expect, light up your game!

The directional light is just light shining down from a certain direction.

You can adjust things like the color of the light, how intense it is, the direction it comes from, and many other things. In this case, though, we just moved it around a bit and played with the intensity to get our ship looking however we want it to look.

Adding a background

Creating a background is in a way very similar to adding a material to an existing shape object. For this project, we created a Quad, which you can think of as a flat plane with no thickness at all.

We have an existing .tif file that’s provided for us to use, but our Quad object only accepts materials. How do we get this to work?

It turns out that if you drag the .tif file onto the Quad, Unity will automatically create a material for you.

Shaders

Another subject that was briefly touched upon is shaders. From my understanding, shaders are used to render effects on a material. They give us the ability to apply brightness, colors, and other special effects on top of an existing image.

Luckily we don’t have to learn how to do any of this (yet); we can just use the existing ones that Unity provides. Yay for frameworks!

In the video, we applied the Unlit/Texture shader to make our background image look brighter.
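In the video this is done through the material’s inspector, but the same swap can be sketched in code. The script name here is made up; Shader.Find with the "Unlit/Texture" name is standard Unity:

using UnityEngine;

public class BrightBackground : MonoBehaviour
{
    void Start()
    {
        // Unlit shaders skip lighting calculations entirely, so the
        // texture is rendered at full brightness regardless of scene lights.
        GetComponent<Renderer>().material.shader = Shader.Find("Unlit/Texture");
    }
}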

Moving the Ship

Now that we have our environment set up, we finally started to do some coding.

In this case, we’re using Mathf.Clamp to set a position boundary: if our position falls below the minimum, it’s held at the minimum, and if it surpasses the maximum, it’s held at the maximum.
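Here’s a minimal sketch of what that clamping looks like in a movement script. The boundary values and field names are placeholders of my own, not the tutorial’s exact numbers:

using UnityEngine;

public class PlayerController : MonoBehaviour
{
    public float speed;
    // Placeholder play-area bounds; tune these to your own scene.
    public float xMin = -6f, xMax = 6f, zMin = -4f, zMax = 8f;

    void FixedUpdate()
    {
        float moveHorizontal = Input.GetAxis("Horizontal");
        float moveVertical = Input.GetAxis("Vertical");

        Rigidbody rb = GetComponent<Rigidbody>();
        rb.velocity = new Vector3(moveHorizontal, 0f, moveVertical) * speed;

        // Mathf.Clamp(value, min, max) returns min if value < min,
        // max if value > max, and value itself otherwise, pinning the
        // ship inside the rectangle (xMin..xMax, zMin..zMax).
        rb.position = new Vector3(
            Mathf.Clamp(rb.position.x, xMin, xMax),
            0f,
            Mathf.Clamp(rb.position.z, zMin, zMax));
    }
}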

Creating Shots

Now that we have a ship and some movement, we moved on to something completely new: shooting things!

We create our bullet the same way we created our background.

We make a Quad object, use the .tif file to create a material (or just use the existing one), and attach the material.

We applied some shaders to it. We can improve performance by using mobile shaders instead of normal ones; mobile shaders offer fewer controls, but as a result they’re less computationally intensive.

Finally we have the code that moves our bullet forward:

using UnityEngine;

public class Mover : MonoBehaviour
{
    public float speed;

    void Start()
    {
        // Give the bullet a constant forward velocity as soon as it spawns.
        // (The old `rigidbody` shorthand was removed in newer Unity versions,
        // so we fetch the Rigidbody component explicitly.)
        GetComponent<Rigidbody>().velocity = transform.forward * speed;
    }
}

The code itself is pretty straightforward: we just set the velocity of our Rigidbody to some value, and the bullet will always travel in that direction.

Shooting shots

Now that we’ve created a prefab of our bullet in the previous lesson, we can start working on shooting.

In the video, we created a new “spawner” GameObject.

A spawner is basically just an empty GameObject whose location we use to create new bullets.

The important thing here is to make the spawner a child of the player GameObject; this way its position always stays consistent relative to our ship.

In our script, we added a public variable for both the bullet prefab and the spawn point location.
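Putting those two public variables together, the shooting logic might look something like this. The fire-rate throttle and the exact field names are my own sketch of where the tutorial is heading, not its verbatim code:

using UnityEngine;

public class PlayerShooting : MonoBehaviour
{
    public GameObject shot;        // the bullet prefab from the previous lesson
    public Transform shotSpawn;    // the empty "spawner" child GameObject
    public float fireRate = 0.25f; // assumed value; seconds between shots

    private float nextFire;

    void Update()
    {
        // Spawn a bullet at the spawner's position while Fire1 is held,
        // throttled so we don't create one every single frame.
        if (Input.GetButton("Fire1") && Time.time > nextFire)
        {
            nextFire = Time.time + fireRate;
            Instantiate(shot, shotSpawn.position, shotSpawn.rotation);
        }
    }
}

Because the spawner is a child of the ship, shotSpawn.position automatically follows the ship around, so every bullet starts from the right place.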

