The Secret to iRacing’s Track Realism? LIDAR

The eNASCAR iRacing Pro Invitational Series is a welcome respite from a world transformed by coronavirus. Here’s how iRacing uses science to make the experience as much like real life as possible. If only the real world could figure out fast repairs!

Many NASCAR drivers use iRacing to supplement manufacturer-owned simulators, which they can access for limited periods of time. They rave about iRacing’s accuracy in everything from billboards to bumps in the track. The secret to iRacing’s fidelity to real life is LIDAR.

LIDAR? What’s a LIDAR?

LIDAR ‘sees’ with light waves the same way radar ‘sees’ with radio waves and sonar ‘sees’ with sound waves.

The term ‘LIDAR’ was originally a portmanteau of ‘light’ and ‘radar’. But ‘radar’ was originally an acronym for RAdio Detection And Ranging. So some people say LIDAR is an acronym for LIght Detection And Ranging.

LIDAR was commercially introduced in the 1960s, shortly after the invention of the laser. As the size and price of lasers (and detectors) have come down, LIDAR has become more accessible and applicable to a range of activities:

  • Forestry (Soil profiling, studying the growth of tree stands, forest fire management)
  • Space exploration (studying the atmosphere on Mars; during the Apollo 15 mission to map the moon)
  • Flood modeling and disaster recovery
  • Natural resource exploration and recovery (Oil and gas exploration, mining)
  • Archeology (Indiana Jones should’ve had one)
  • Construction
  • NASCAR inspections

The Physics of Sight

Any type of ‘seeing’ requires three things:

  • A source
  • An object
  • A detector

For example: light waves from the Sun (the source) reflect from a piece of paper (an object) and are detected by your eyes.

How humans see, including the source (the Sun), an object (a red block) and the detector (an eye)

We derive a lot of information from that simple process.

Color

‘White’ light contains many different colors, as you know if you’ve ever played with a prism or seen a rainbow. If you see the object as red, that’s because that particular object absorbs all colors of light except red.

We see color because some wavelengths of light are absorbed and some reflected. We only see the reflected ones.

Shape

If light waves hit a smooth surface, they reflect back as I showed above. But if they hit something like a rumble strip, the shape of the object changes how the waves reflect.

Light waves reflect differently from different shapes and textures. Our brains take this scattering into account.

On a smaller scale, this same principle gives us an object’s texture.

When you walk into a track and look around, your eyes convert those light waves to electrical signals. Your brain analyzes the light waves coming off everything in your field of view and creates a 3D map that tells you how far away objects are, along with their shapes, textures, and colors.

Our eyes and brains are really quite amazing — which is why it’s a challenge to create machines that ‘see’.

The Electromagnetic Spectrum

Different colors of light have different wavelengths. A rainbow is ordered in decreasing wavelength from red to violet. Red light has a wavelength of about 700 nanometers (that’s about 1/100th of the diameter of a human hair). Violet light is about 400 nanometers.

Colors at the BIV end of the rainbow have shorter wavelengths, while colors at the ROY end have longer wavelengths.

The human eye can detect wavelengths from about 380 nm to 740 nm, but that’s a tiny, tiny, tiny part of the electromagnetic spectrum.

SiriusXM works at about 2.3 GHz, which is a wavelength of roughly 13 centimeters (about 5 inches).

Each tick on the graph above is a factor of ten. Electromagnetic waves exist with wavelengths from about 10⁻¹² meters to 10¹⁵ meters. I noted where satellite radio waves fall just because I was curious.
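The conversion between frequency and wavelength is just λ = c/f. A quick sketch (the 2.3 GHz figure comes from the satellite radio example above; the rest is standard physics):

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for an electromagnetic wave of the given frequency."""
    return C / freq_hz

# SiriusXM satellite radio broadcasts near 2.3 GHz.
print(wavelength_m(2.3e9))  # ≈ 0.13 m, i.e. roughly 13 cm
```

The same formula works at the other end of the spectrum: plug in the ~430 THz frequency of red light and you get back the ~700 nm wavelength from the rainbow discussion.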

Infrared and ultraviolet waves sit just outside the range we can see: infrared wavelengths are slightly longer than visible light, and ultraviolet wavelengths are slightly shorter. You can ‘see’ infrared waves with night vision goggles. Some animals see ultraviolet waves.

Seeing Without Light

Regardless of what wavelength of light you’re using, the process is the same: there’s a source, an object and a detector. But because you’re working outside the visible spectrum, ambient light won’t do the job. The machine must both send and detect the waves you’re using.

  • How long it takes for the light pulse to bounce off the object and return tells you how far away it is.
  • How much of the wave you sent out comes back tells you something about the color.
  • You can tell something about the texture of the object by comparing the incident wave to the reflected wave.
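The first bullet is a simple time-of-flight calculation: the pulse travels out and back, so the one-way distance is half of speed times time. A minimal sketch (my own illustration, not iRacing’s code):

```python
C = 299_792_458  # speed of light, m/s

def distance_m(round_trip_s: float) -> float:
    """Range to a target from the round-trip time of a light pulse.

    The pulse travels out and back, so the one-way distance is
    half of (speed x time).
    """
    return C * round_trip_s / 2

# A pulse that returns after 2 microseconds came from about 300 meters away.
print(distance_m(2e-6))  # ≈ 299.79
```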

LIDAR

Radar uses radio waves, which are huge. That’s fine if you’re detecting airplanes and cloud formations. If you want to see smaller features, you need smaller waves.

  • LIDAR doesn’t use visible light
    • A visible light source would have to be extremely powerful to be seen against the background of ambient light.
    • You would have to account for the ambient light already there
    • You risk blinding someone with an incredibly intense light.
  • LIDAR uses infrared waves
    • Most LIDAR systems work in the near-infrared, either at 905 nm or 1550 nm.
That’s above the human eye’s limit, but infrared light can still damage eyes. Longer wavelengths carry less energy per photon, and 1550 nm light is mostly absorbed by the front of the eye before it can reach the retina, which makes it safer at higher power. The trade-off is that the technology to work at 1550 nm is more expensive.
    • This wavelength gives LIDAR millimeter-level resolution. You’re not just detecting bumps, you’re detecting small cracks and the texture of the asphalt.
  • LIDAR uses a laser as a source. A laser produces a single wavelength, which makes the measurement more accurate.
Left: A LIDAR rig being set up at a track.
Right: LIDAR is so sensitive, it can detect painted lines on the track.
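That millimeter-level resolution implies astonishing timing precision: to resolve a range difference Δd, the electronics must resolve a round-trip time difference Δt = 2Δd/c. A back-of-envelope check (my arithmetic, not a spec quoted from any LIDAR vendor):

```python
C = 299_792_458  # speed of light, m/s

def timing_resolution_s(range_resolution_m: float) -> float:
    """Round-trip timing precision needed to resolve a given range difference."""
    return 2 * range_resolution_m / C

# Resolving 1 mm in range means resolving picosecond-scale time differences.
print(timing_resolution_s(0.001))  # ≈ 6.7e-12 seconds
```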

How They Do It

  • It takes time. The iRacing team scans about 300 feet of track at a time, and each scan takes about an hour. The rotating LIDAR captures everything in a 360-degree circle.
  • It produces a massive set of numbers. The result is a 3D point cloud that captures the reflectivity, texture and shape of every object in the scan. A road course (like COTA at 3.4 miles) needs more than 250 million points.
A point cloud in false color
You can see how the reflectivity of the green trees is much different from the reflectivity of the track.

The point cloud is just a set of numbers. LIDAR doesn’t measure ‘green’, but rather measures the reflectivity of the object to infrared waves. The points are then mapped to real colors. Here’s an expanded version that lets you see the density of points.

A closer view of the 3D LIDAR point cloud
The black circle is where the scanner was located. They’ll do overlapping scans to fill in those areas.
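Conceptually, each entry in the point cloud is just a position plus a reflectivity value; the false colors are applied afterward. A toy sketch of the data structure (the field names and 0-to-1 scaling are my own illustration, not iRacing’s actual format):

```python
from dataclasses import dataclass

@dataclass
class LidarPoint:
    x: float          # meters, relative to the scanner
    y: float
    z: float
    intensity: float  # infrared reflectivity, scaled to 0..1

def false_color_gray(point: LidarPoint) -> int:
    """Map reflectivity to an 8-bit gray value for a quick visualization."""
    return round(255 * point.intensity)

# A highly reflective painted line stands out against dull asphalt:
line = LidarPoint(1.0, 2.0, 0.0, intensity=0.9)
asphalt = LidarPoint(1.1, 2.0, 0.0, intensity=0.2)
print(false_color_gray(line), false_color_gray(asphalt))
```

Multiply a structure like that by 250 million points and you can see why turning a scan into a raceable track is a serious data-processing job.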

iRacing supplements the LIDAR scan with:

  • Thousands of photos that are used to fine-tune the computer model.
  • GPS data that situates the track within the surrounding terrain.

It takes about 13,000 person-hours of time to get from the point cloud to a raceable track. Luckily, they divide and conquer, because 13,000 person-hours is about a year-and-a-half of someone working 24/7.

LIDAR Allows for Dynamics, Too

Having a detailed characterization of the track allows the programmers to make the track change in response to things like weather and friction heating from the cars’ tires.

  • By constantly calculating the position of the Sun throughout the race, they can put appropriate regions of the track in shade. They can manipulate the weather and make the track respond to cloud cover.
  • Darker areas on a track heat up more than lighter areas, and heat means less grip. I covered this on the blog in a post about why concrete and asphalt race differently.
  • The car’s lines feed back into the track temperature. If everyone’s racing one groove, that area of the track will heat up more than the rest of the track. Just as in real life, moving to another groove might get you more grip.
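The feedback loop in those bullets can be sketched as a toy model. Every constant below is made up for illustration; it is not iRacing’s actual physics:

```python
# Toy model: darker patches absorb more sunlight, and cars running one
# groove dump extra heat into that strip of track. Grip falls off as the
# surface temperature climbs past an ideal value. All constants invented.

def surface_temp_c(ambient_c: float, sun: float, darkness: float,
                   traffic_heat_c: float) -> float:
    """sun and darkness are 0..1; traffic_heat_c is heating from tires, in C."""
    return ambient_c + 25 * sun * darkness + traffic_heat_c

def grip(temp_c: float, base_grip: float = 1.0,
         fade_per_deg: float = 0.004, ideal_c: float = 30.0) -> float:
    """Grip declines linearly above an ideal surface temperature."""
    return base_grip - fade_per_deg * max(0.0, temp_c - ideal_c)

busy_groove = surface_temp_c(25, sun=1.0, darkness=0.9, traffic_heat_c=8)
fresh_lane = surface_temp_c(25, sun=1.0, darkness=0.9, traffic_heat_c=0)
print(grip(busy_groove) < grip(fresh_lane))  # True: off-line can mean more grip
```

Crude as it is, the model reproduces the behavior in the last bullet: the well-used groove runs hotter than an unused lane, so moving off-line can find more grip.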

Coming Soon To A Roadway Near You

Autonomous cars use LIDAR to navigate, in addition to multiple cameras. The box you see on top of an autonomous car holds the LIDAR. The car’s computer must gather all the data and synthesize it in real time.

Left: An autonomous car showing the LIDAR box on top
Right: A graphic showing how an autonomous car would use a combination of cameras and LIDAR to detect everything around it.

A Personal Note

I finished my first novel last year. It’s about a journalist who gets entangled in a plot to destabilize the global economy using an Ebola-like bioweapon. I spent a lot of time studying viruses as background research for that book. I learned a lot of things I never dreamed I’d actually have to apply in real life.

Please listen to CDC and NIH doctors and scientists. Follow their guidance. You don’t go to a plumber to get a root canal. Don’t get medical advice from politicians, pundits, or social media. That includes celebrity doctors more interested in advancing their personal brand than telling the truth.

Above all, trust Dr. Fauci.

Follow the guidelines. Keep your distance from people. Get exercise and, if possible, get outdoors for the sake of your mental health.

Ask for help if you need it.

Give help if you can.
