Crowdfunding promotion: Night Terrors - Augmented Reality Survival Horror

posted Mar 11, 2015, 12:14 AM by J Shaw   [ updated Mar 11, 2015, 12:22 AM ]

A photorealistic, ultra immersive gaming experience that transforms your home into a terrifying, unfamiliar hellscape.

About this project

$12,975 pledged of $140,000 goal
 
  • Night Terrors is a highly immersive, photorealistic, binaural, augmented reality survival horror game for mobile devices.  Gameplay takes place at home, after dark, with the lights off and your headphones on.  
  • The game controls what you see, what you hear, and where you go.  Your device's LED is all the light you get.  The camera and microphone feeds are analyzed and processed in real time.  Photorealistic elements are added to the camera feed.  Audio is spatialized, mixed with the microphone feed, and then routed to the headphones, delivering an immersive binaural audio experience.
  • Gameplay is deceptively simple.  You only have to do two things.
  • 1) Save the girl.
  • 2) Survive.
  • Figuring out how you'll do that is the real challenge.
  • Hint:  It doesn't involve sitting on your couch.
  •  Nightfall.
    The time has come.
    The signal grows stronger.
    Go. Save her life.
  • Night Terrors is built on four major algorithms. These algorithms work together to deliver an unmatched augmented reality experience.
  • The intensity of light radiating from a point source is inversely proportional to the square of the distance from that source.
    INVERSE² exploits this fact.
  • Perspective Correction - Gyroscope sensor data is used to preprocess the image.  Walls are kept at a 90-degree angle to minimize mapping noise (a rough sketch follows this list).
  • Point Source Modeling - The LEDs on mobile devices are not true point sources of light.  With an understanding of how they differ, they can be modeled as though they were.  
  • Environment Filtering - To build a clean map, intensity data must be collected only from the walls.  Walls are identified by exploiting basic assumptions and player movement.  Once identified, non-wall pixels are masked out.
  • Intensity Scanning - Gyroscope sensor data is used as a starting point. The location of the horizon is estimated and the ceiling and floor are approximated.  Walls are scanned.  Objects like paintings are ignored.
  • Depth Mapping - Intensity values are collected.  Pixel columns are averaged to reduce noise.  Intensity is converted into distance using known information about the camera lens, chip, exposure and LED (see the sketch after the video below).
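
To make the perspective-correction step concrete, here is a minimal sketch in Python with OpenCV. It is our illustration, not the game's actual code: it simply counter-rotates each camera frame by the roll angle reported by the gyroscope so that wall edges stay vertical before any intensity scanning.

    # Illustrative sketch only -- assumes a camera frame as a numpy array and
    # a roll angle in degrees from the device gyroscope; not the shipping code.
    import cv2

    def correct_roll(frame, roll_deg):
        """Counter-rotate the frame so walls stay at a 90-degree angle."""
        h, w = frame.shape[:2]
        # Rotate about the image center by the opposite of the measured roll.
        M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -roll_deg, 1.0)
        return cv2.warpAffine(frame, M, (w, h))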

    [Embedded YouTube video]
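
The inverse-square law above is what makes depth mapping possible: if the measured intensity I of a wall pixel falls off as 1/d², then distance recovers as d = K / sqrt(I), where K folds together the LED power, lens, chip, and exposure. Here is a minimal numpy sketch of the column-averaged conversion described in the Depth Mapping step; the calibration constant is invented and the function names are ours.

    # Illustrative sketch of intensity-to-depth conversion; K is a made-up
    # calibration constant standing in for LED, lens, chip, and exposure data.
    import numpy as np

    K = 2.5  # calibration constant: meters * sqrt(normalized intensity)

    def depth_profile(gray, wall_mask):
        """One depth estimate per pixel column of a masked grayscale frame.

        gray      -- float image with values in [0, 1]
        wall_mask -- boolean array, True where a pixel belongs to a wall
        """
        # Environment filtering: drop everything that is not a wall.
        masked = np.where(wall_mask, gray, np.nan)
        # Average each pixel column to reduce sensor noise.  Columns with
        # no wall pixels at all come out as NaN.
        column_intensity = np.nanmean(masked, axis=0)
        # Inverse-square law: I ~ 1/d**2, so d ~ K / sqrt(I).
        return K / np.sqrt(np.clip(column_intensity, 1e-6, None))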

  • Inertial Navigation System - Mobile device sensors are noisy!  Building an accurate inertial navigation system required taking a new approach.  Depth data is used to compute player position and movement.  The Position Ready Inertially Motivated Environment Scanner uses device sensor data to aid the search logic of an optical flow algorithm.
  • Search Logic - Data on the X, Y, and Z axes and data from the Pitch, Roll, and Yaw axes are used to approximate expected changes between frames.  This approximation guides the optical flow search window, and search time is drastically reduced (sketched below).
  • Optical Flow - Player movement is computed by comparing the depth of identical points between two frames (a toy version follows the video below).  Every component in the device is brought together to create something special.  A map of the player’s home.  With it, the "impossible" becomes possible.
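
Here is a rough sketch of that search logic, in Python with invented constants (our toy, not the game's actual implementation): gyroscope rates predict how far a feature should move between frames, and block matching then searches only a small window around that prediction instead of scanning the whole image.

    # Illustrative sketch of sensor-aided search; all constants are invented.
    import numpy as np

    FOCAL_PX = 1100.0  # focal length in pixels; device-specific

    def predicted_shift(pitch_rate, yaw_rate, dt):
        """Pixel shift expected from device rotation between two frames."""
        # Small-angle approximation: rotating by theta radians shifts the
        # image by roughly FOCAL_PX * theta pixels.
        return (int(round(FOCAL_PX * yaw_rate * dt)),
                int(round(FOCAL_PX * pitch_rate * dt)))

    def match_in_window(prev, curr, pt, shift, radius=8):
        """Relocate pt in curr, searching only around the predicted spot."""
        px, py = pt  # pt is assumed to sit well inside the frame
        patch = prev[py - 4:py + 5, px - 4:px + 5].astype(float)
        cx, cy = px + shift[0], py + shift[1]
        best, best_err = (cx, cy), float("inf")
        for y in range(cy - radius, cy + radius + 1):
            for x in range(cx - radius, cx + radius + 1):
                cand = curr[y - 4:y + 5, x - 4:x + 5].astype(float)
                if cand.shape != patch.shape:
                    continue  # candidate window ran off the frame edge
                err = np.sum((cand - patch) ** 2)  # sum of squared differences
                if err < best_err:
                    best, best_err = (x, y), err
        return best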

    [Embedded YouTube video]
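
And a toy version of the optical-flow step itself, under the same caveats: points matched between two frames carry depths from the intensity map, walking toward a wall shrinks those depths, and the per-frame step can be dead-reckoned into a position estimate.

    # Illustrative sketch: matched wall points whose depth shrinks between
    # frames imply forward motion; integrate it into an (x, y) position.
    import math

    def forward_step(depths_prev, depths_curr):
        """Estimate forward motion in meters from matched per-point depths."""
        deltas = sorted(dp - dc for dp, dc in zip(depths_prev, depths_curr))
        return deltas[len(deltas) // 2]  # median resists bad matches

    class PositionTracker:
        """Dead-reckon the player along their heading from depth deltas."""

        def __init__(self):
            self.x, self.y = 0.0, 0.0

        def update(self, step_m, heading_rad):
            # heading_rad would come from the device's own sensor fusion.
            self.x += step_m * math.cos(heading_rad)
            self.y += step_m * math.sin(heading_rad)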

  • Practical vs CGI - Even with the benefit of rendering time, practical effects win out at the cinema for us.  3D models, rendered in real time on a mobile device, don't even come close to matching the real world. 
  • Photography - In Night Terrors, every augmented element is photographed using a unique process.  Elements are lit with a point source of light, effectively giving us a depth map for each element. 
  • Creating A Perfect Match - Compositing elements seamlessly into the frame requires matching the apparent lighting conditions of each element and the player's environment.  With control over the device LED, the lighting conditions of both element and environment can be manipulated to create a match (a rough sketch follows the video below).
  • Our project video can be viewed in 720p HD below:

    [Embedded YouTube video]
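
One way to picture that lighting match, as a sketch under our own assumptions rather than the actual pipeline: because every element was photographed under a point source of light, the same inverse-square law lets its brightness be rescaled for the distance at which it is composited into the player's room.

    # Illustrative sketch: rescale a photographed element's brightness for a
    # new LED distance, then blend it into the camera frame. Images are float
    # RGB arrays in [0, 1]; d0_m is the capture distance, d_m the scene one.
    import numpy as np

    def relight(element_rgb, d0_m, d_m):
        """Apply inverse-square falloff to match the new apparent distance."""
        gain = (d0_m / d_m) ** 2
        return np.clip(element_rgb * gain, 0.0, 1.0)

    def composite(frame, element, mask, d0_m, d_m):
        """Blend the relit element into the frame where the 2D mask is True."""
        lit = relight(element, d0_m, d_m)
        return np.where(mask[..., None], lit, frame)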

  •  Platforms:
    iPhone 5, 5c, 5s, 6, and 6+ are guaranteed, but we really want our game on every platform we can get it on.  We have iOS, Android, and Windows Phone devices and like them all.  Android development will follow iOS, with Windows Phone after that, hardware permitting.

Risks and challenges

Bryan here. Working this closely with pure evil, one always runs the risk of being eviscerated. If I actually lost both of my eyes, I really couldn't see us finishing the game.

Puns!

We are attempting to push the boundaries of a technology filled with unlimited possibilities shrouded by complexity. We have achieved a lot in the last 14 months. We eat, sleep, and breathe this game, and we believe in the future of this technology. We are committed to keeping our backers well informed, because this journey belongs to all of us.

Developers solve problems; that's all we really do. We set a goal and then we fail over and over until the damn thing works the way we want it to. Good developers don't know how to give up. If you back this project, together we are going to make something special.