HTML5 appears to have a number of benefits for consumers and car manufacturers. But what is often good for the goose is not necessarily good for the developer. Talking to the guys in the trenches is critical to understanding the true viability of HTML5.
Andy Gryc and Sheridan Ethier, manager of the automotive development team at QNX, pair up for a technical discussion on HTML5. They explore whether this new technology can support rich user interfaces, how HTML5 apps can be blended with apps written in OpenGL, and if interprocess communication can be implemented between native and web-based applications.
So without further ado, here’s the latest in the educational series of HTML5 videos from QNX.
This interview of Sheridan Ethier is the third in a series from QNX on HTML5.
Is HTML5 a good gamble?
As the consumer and automotive worlds continue to collide, HTML5 looks like a good bet. And not a long shot either. In fact, the odds are all automakers will eventually use it. But since the standard won’t be mature for some time yet, should you take a chance on it now?
To answer this, Andy Gryc talks to Matthew Staikos of RIM. Matthew is the manager of the browser and web platform group at RIM, and has over 10 years of software development experience with a strong focus on WebKit for mobile and embedded systems. Matthew co-founded Torch Mobile, which was acquired by RIM for their browser technology.
Andy’s conversation with Matthew is the subject of the following video, the second in an educational series designed to get an industry-wide perspective on HTML5.
This interview of Matthew Staikos is the second in a series from QNX on HTML5.
What’s HTML5 got to do with automotive?
There’s been a lot of noise lately about HTML5. A September 2011 report by binvisions shows that search engines and social media web sites are leading the way toward adoption: Google, Facebook, YouTube, Wikipedia, Twitter, and plenty more have already transitioned to HTML5. Some are taking it even further: Facebook has an HTML5 Resource Center for developers and the Financial Times has a mobile HTML5 version of their website.
It won’t be long before HTML5 is ubiquitous. We think automakers should (and will) use it.
To elucidate the technology and its relevance, we’ve created a series of educational videos on the topic. Here is the first in that series. Interviews with partners, customers, and industry gurus will soon follow.
This simple overview is the first in a series from QNX on HTML5. (Personally I like the ending the best.)
Beyond the dashboard: discover how QNX touches your everyday life
QNX technology is in cars — lots of them. But it’s also in everything from planes and trains to smart phones, smart buildings, and smart vacuum cleaners. If you're interested, I happen to have an infographic handy...
I was a lost and lonely soul. Friends would cut phone calls short, strangers would move away from me on the bus, and acquaintances at cocktail parties would excuse themselves, promising to come right back — they never came back. I was in denial for a long time, but slowly and painfully, I came to the realization that I had to take ownership of this problem. Because it was my fault.
To be specific, it was my motor mouth. Whenever someone asked what I did for a living, I’d say I worked for QNX. That, of course, wasn’t a problem. But when they asked what QNX did, I would hold forth on microkernel OS architectures, user-space device drivers, resource manager frameworks, and graphical composition managers, not to mention asynchronous messaging, priority inheritance, and time partitioning. After all, who doesn't want to learn more about time partitioning?
Well, as I subsequently learned, there’s a time and place for everything. And while my passion about QNX technology was well-placed, my timing was lousy. People weren’t asking for a deep dive; they just wanted to understand QNX’s role in the scheme of things.
As it turns out, QNX plays a huge role, and in very many things. I’ve been working at QNX Software Systems for 25 years, and I am still gobsmacked by the sheer variety of uses that QNX technology is put to. I'm especially impressed by the crossover effect. For instance, what we learn in nuclear plants helps us offer a better OS for safety systems in cars. And what we learn in smartphones makes us a better platform supplier for companies building infotainment systems.
All of which to say, the next time someone asks me what QNX does, I will avoid the deep dive and show them this infographic instead. Of course, if they subsequently ask *how* QNX does all this, I will have a well-practiced answer. :-)
Did I mention? You can download a high-res JPEG of this infographic from our Flickr account and a PDF version from the QNX website.

Stay tuned for 2015 CES, where we will introduce even more ways QNX can make a difference, especially in how people design and drive cars.
And lest I forget, special thanks to my colleague Varghese at BlackBerry India for conceiving this infographic, and for the QNX employees who provided their invaluable input.
QNX-based nav system helps Ford SUVs stay on course down under
Paul Leroux
To reduce driver distraction, the system offers a simplified user interface and feature set. And, to provide accurate route guidance, the system uses data from an internal gyroscope and an external traffic message channel, as well as standard GPS signals. Taking the conditions of local roads into account, the software provides a variety of alerts and speed-camera warnings; it also offers route guidance in Australian English.
The navigation system is based on the iGO My way Engine, which runs in millions of navigation devices worldwide. To read NNG's press release, click here.

SWSA's new nav system for the Ford Territory is based on the Freescale i.MX31L processor, QNX Neutrino RTOS, and iGO My way Engine.
The need for green in automotive

- “The personal automobile is the single greatest polluter as emissions from millions of vehicles on the road add up.”— US Environmental Protection Agency
Working at QNX has given me insight into just how complex the problem is and how going green in automotive is not going to be a revolution. I've come to realize that it will require a good number of players on a large number of fronts.
An example of what happens when your car takes way too long to boot. :-)
To prevent such undignified delays, these systems typically do not power down completely. Instead, they suspend to RAM while the vehicle is off. This lets the system boot ‘instantly’ whenever the ignition turns over. But because there’s a small current draw to keep RAM alive, this trickle continually drains the battery. This might have minimal consequences today (other than cost to the manufacturer, which is a whole other story) but in the brave new world of electric and hybrid cars, battery capacity equals mileage. Typical systems thus shorten the range of green vehicles and, in the case of hybrids, force drivers to use not-so-green systems more often. More importantly perhaps, these systems give would-be buyers ‘range anxiety’. Indeed, according to the Green Market’s Richard Matthews, battery life is one of the top reasons the current adoption rate is so low.
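To put a rough number on that trickle, here's a back-of-envelope sketch. The 20 mA suspend current and the hours-parked figure are assumptions for illustration only, not measurements from any shipping system:

```python
# Back-of-envelope estimate of suspend-to-RAM battery drain.
# All numbers below are assumed for illustration, not measured values.
SUSPEND_CURRENT_A = 0.020      # assumed trickle current while suspended to RAM
BATTERY_VOLTAGE_V = 12.0       # nominal automotive battery voltage
HOURS_PARKED_PER_WEEK = 160    # car sitting idle roughly 23 hours a day

def weekly_drain_wh(current_a=SUSPEND_CURRENT_A,
                    voltage_v=BATTERY_VOLTAGE_V,
                    hours=HOURS_PARKED_PER_WEEK):
    """Energy (watt-hours) bled from the battery in a week of suspend-to-RAM."""
    return current_a * voltage_v * hours

if __name__ == "__main__":
    print(f"~{weekly_drain_wh():.0f} Wh per week")  # prints ~38 Wh per week
```

Small in absolute terms, but in an electric vehicle every watt-hour held in reserve is range the driver never gets back.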
A little-known feature of QNX technology helps solve this problem.
Architects using the QNX OS can organize the boot process to bring up complex systems in a matter of seconds. Ours is not an all-or-nothing proposition as it is with monolithic operating systems that must load an entire system before anything can run – Windows and Linux are prime examples. QNX supports a gradual phasing in of system functionality to get critical systems up and running while it loads less-essential features in the background. A QNX-based system can start from a cold boot every time. Which means no battery drain while the car is off.
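The phasing idea can be caricatured in a few lines. This is not a real QNX boot script, just a sketch of the pattern: driver-critical services come up first and in order, while the nice-to-haves load in the background (the service names are invented):

```python
import threading

# Sketch of phased startup -- NOT an actual QNX buildfile.
# Service names and the split between phases are invented for illustration.
CRITICAL = ["display", "backup_camera"]                  # must be up in seconds
DEFERRED = ["navigation", "media_player", "hmi_skins"]   # can trickle in later

started = []

def start(service):
    # Stand-in for spawning a real process or resource manager.
    started.append(service)

def boot():
    # Phase 1: bring up safety/driver-critical services first, in order.
    for svc in CRITICAL:
        start(svc)
    # Phase 2: load everything else in the background, so the critical
    # services are already usable while the rest of the system fills in.
    t = threading.Thread(target=lambda: [start(s) for s in DEFERRED])
    t.start()
    t.join()
    return started
```

The point of the sketch is the ordering guarantee: nothing in the deferred list can delay the services the driver actually needs at key-on.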
And while this is no giant leap for mankind it is certainly a solid step in the right direction. If the rest of us (consumers, that is) contributed similarly by trading in our clunkers for greener wheels, the industry could undoubtedly move forward in leaps and bounds. I suppose this means I’m going to have to take a long hard look at my 2003 Honda Civic.
Enabling the next generation of cool
Capturing QNX presence in automotive can’t be done IMHO without a nod to our experience in other markets. Take, for example, the extreme reliability required for the International Space Station and the Space Shuttle. This is the selfsame reliability that automakers rely on when building digital instrument clusters that cannot fail. Same goes for the impressive graphics on the BlackBerry Playbook. As a result, Tier1s and OEMs can now bring consumer-level functionality into the vehicle.
Multicore is another example. The automotive market is just starting to take note while QNX has been enabling multi-processing for more than 25 years.
So I figure that keeping our hand in other industries means we actually have more to offer than other vendors who specialize.
I tried to capture this in a short video. It had to be done overnight so it’s a bit of a throw-away but (of course) I'd like to think it works. :-)
A sweet ride? You’d better 'beleave' it
Is Autumn the best season for a long, leisurely Sunday drive? Well, I don’t know about your neck of the woods, but in my neck, the trees blaze like crimson, orange, and yellow candles, transfiguring back roads into cathedrals of pure color. When I see every leaf on every tree glow like a piece of sunlight-infused stained glass, I make a religious effort to jump behind the wheel and get out there!
Now, of course, you can enjoy your Autumn drive in any car worth its keep. But some cars make the ride sweeter than others — and the Mercedes S Class Coupe, with its QNX-powered infotainment system and instrument cluster, is deliciously caloric.
This isn’t a car for the prim, the proper, the austere. It’s for pure pleasure – whether you take pleasure in performance, luxury, or beauty of design. Or all three. The perfect car, in other words, for an Autumn drive. Which is exactly what the folks at Mercedes thought. In fact, they made a photo essay about it — check it out on their Facebook page.

Source: Mercedes
Marking over 5 years of putting HTML in production cars

For me, the realization occurred 11 years ago, when I signed up with QNX Software Systems. QNX was already connecting devices to the web, using technology that was light years ahead of anything else on the market. For instance, in the late 90s, QNX engineers created the “QNX 1.44M Floppy,” a self-booting promotional diskette that showcased how the QNX OS could deliver a complete web experience in a tiny footprint. It was an enormous hit, with more than 1 million downloads.
Embedding the web, dot com style: The QNX-powered Audrey
At the time, Don Fotsch, one of Audrey’s creators, coined the term “Internet Snacking” to describe the device’s browsing environment. The dot com crash in 2001 cut Audrey’s life short, but QNX maintained its focus on enabling a rich Internet experience in embedded devices, particularly those within the car.
The point of these stories is simple: Embedding the web is part of the QNX DNA. At one point, we even had multiple browser engines in production vehicles, including the Access Netfront engine, the QNX Voyager engine, and the OpenWave WAP Browser. In fact, we have had cars on the road with Web technologies since model year 2006.
With that pedigree in enabling HTML in automotive, we continue to push the envelope. We already enable unlimited web access with full browsers in BMW and other vehicles, but HTML in automotive is changing from a pure browsing experience to a full user experience encompassing applications and HMIs. With HTML5, this experience extends even to speech recognition, AV entertainment, rich animations, and full application environments — Angry Birds anyone?
People often talk about “App Snacking,” but in the next phase of HTML5 in the car, it will be “What’s for dinner?”!
Harman infotainment systems gear up with QNX technology
![]() |
Paul Leroux |
Mind you, Harman isn't just about the high end. They also offer a scalable infotainment platform that can target both higher-end and lower-end vehicles. And they aren't just about European cars, either. Earlier this year, they became the first non-Japanese company to supply an infotainment system (the QNX-based Entune system) to Toyota. They also supply systems to Hyundai, Lexus, Subaru, and Ssangyong.
Since 2003, Harman has used the QNX OS as the software platform for its infotainment products. (In fact, Harman owned QNX Software Systems for about 5 years, before QNX became a subsidiary of RIM.) In this video, Rick Kreifeldt, Harman's VP of global automotive research and innovation, discusses how QNX technology and expertise help Harman cut time-to-market and create greener products. Check it out:
A version of this post originally appeared on the On Q blog.
Some forward-thinking on looking backwards
The first rear-view camera appeared on a concept car in 1956. It's time to go mainstream.
Until today, I knew nothing about electrochromism — I didn’t even know the word existed! Mind you, I still don’t know that much. But I do know a little, so if you’re in the dark about this phenomenon, let me enlighten you: It’s what allows smart windows to dim automatically in response to bright light.
A full-on technical explanation of electrochromism could fill pages. But in a nutshell, electrochromic glass contains a substance, such as tungsten trioxide, that changes color when you apply a small jolt of electricity to it. Apply a jolt, and the glass goes dark; apply another jolt, and the glass becomes transparent again. Pretty cool, right?
Automakers must think so, because they use this technology to create rear-view and side-view mirrors that dim automatically to reduce glare — just the thing when the &*^%$! driver behind you flips on his high-beams. Using photo sensors, these mirrors measure incoming light; when it becomes too bright, the mirror applies the requisite electrical charge and, voilà, no more fried retinas. (I jest, but in reality, mirror glare can cause retinal blind spots that affect driver reaction time.)
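The control logic behind such a mirror can be sketched roughly as a hysteresis loop: one threshold to darken, a lower one to clear, so the mirror doesn't flicker when the light hovers near the boundary. The lux values here are invented for illustration:

```python
# Toy sketch of auto-dimming mirror logic with hysteresis.
# Thresholds are invented; real mirrors use analog, continuously variable control.
DIM_ABOVE = 800    # lux reading at which we darken the mirror (assumed)
CLEAR_BELOW = 500  # lux reading at which we clear it again (assumed)

def next_state(dimmed, lux):
    """Return True if the mirror should be dimmed given the sensor reading."""
    if not dimmed and lux > DIM_ABOVE:
        return True    # apply charge: glass goes dark
    if dimmed and lux < CLEAR_BELOW:
        return False   # apply charge again: glass clears
    return dimmed      # in the dead band, hold the current state
```

The gap between the two thresholds is the whole trick: without it, headlights flickering around a single cutoff would make the mirror strobe.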
So why am I blabbing about this? Because electrochromic technology highlights a century-old challenge: How do you see what — or who — is behind your car? And how do you do it even in harsh lighting conditions? It’s a hard problem to solve, and it’s been with us ever since Dorothy Levitt, a pioneer of motor racing, counseled women to “hold aloft” a handheld mirror “to see behind while driving.” That was in 1906.
Kludges
For sure, we’ve made progress over the years. But we still fall back on kludges to compensate for the inherent shortcomings of placing a mirror meters away from the back of the vehicle. Consider, for example, the aftermarket wide-angle lenses that you can attach to your rear window — a viable solution for some vehicles, but not terribly useful if you are driving a pickup or fastback.
Small wonder that NHTSA has ruled that, as of May 2018, all vehicles under 10,000 pounds must ship with “rear visibility technology” that expands the driver’s field of view to include a 10x20-foot zone directly behind the vehicle. Every year, backover crashes in the US cause 210 fatalities and 15,000 injuries — many involving children. NHTSA believes that universal deployment of rear-view cameras, which “see” where rear-view mirrors cannot, will help reduce backover fatalities by about a third.
Buick is among the automotive brands that are “pre-complying” with the standard: every 2015 Buick model will ship with a rearview camera. Which, perhaps, is no surprise: the first Buick to sport a rearview camera was the Centurion concept car, which debuted in 1956:

1956 Buick Centurion: You can see the backup camera just above the center tail light.
The Centurion’s backup camera is one of many forward-looking concepts that automakers have demonstrated over the years. As I have discussed in previous posts, many of these ideas took decades to come to market, for the simple reason they were ahead of their time — the technology needed to make them successful was too immature or simply didn’t exist yet.
Giving cameras the (fast) boot
Fortunately, the various technologies that enable rear-view cameras for cars have reached a sufficient level of maturity, miniaturization, and cost effectiveness. Nonetheless, challenges remain. For example, NHTSA specifies that rear-view cameras meet a number of requirements, including image size, response time, linger time (how long the camera remains activated after shifting from reverse), and durability. Many of these requirements are made to order for a platform like the QNX OS, which combines high reliability with very fast bootup and response times. After all, what’s the use of a backup camera if it finishes booting *after* you back out of your driveway?
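As a rough illustration of the linger-time requirement, here's a toy model of a backup camera's on/off behavior. The 4-second linger value is an assumption for the sketch, not NHTSA's actual figure:

```python
# Toy event-driven model of reverse-gear/linger behaviour -- not production code.
LINGER_SECONDS = 4.0  # assumed linger window after leaving reverse

class BackupCamera:
    def __init__(self):
        self.on = False
        self._off_at = None  # timestamp when the linger window expires

    def gear_changed(self, gear, now):
        if gear == "reverse":
            self.on = True
            self._off_at = None                   # cancel any pending shutoff
        elif self.on:
            self._off_at = now + LINGER_SECONDS   # start the linger countdown

    def tick(self, now):
        # Called periodically; turns the camera off once the linger expires.
        if self._off_at is not None and now >= self._off_at:
            self.on = False
            self._off_at = None
```

Shifting briefly through reverse and back cancels the shutoff rather than blanking the display, which is exactly the driver-friendly behavior the linger requirement is after.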

Instrument cluster in QNX technology concept car displaying video from a backup camera.
Domo arigato, for self-driving autos
Lynn Gayowski
Let's begin at the beginning. Obviously the first step is to watch the 1983 Mr. Roboto music video. To quote selectively, "I've come to help you with your problems, so we can be free." As Styx aptly communicated with the help of synthesizers, robots have the potential to improve our lives. Current research predicts autonomous cars will reduce traffic collisions and improve traffic flow, plus drivers will be freed up for other activities.
So let's take a look at how QNX has been participating in the progress to self-driving vehicles.
The microkernel architecture of the QNX operating system provides an exemplary foundation for systems with functional safety requirements, and as you can see from this list, there are projects related to cars, underwater robots, and rescue vehicles.
Take a look at this 1997 video from the California Partners for Advanced Transportation Technology (PATH) and the National Automated Highway System Consortium (NAHSC) showing their automated driving demo — the first project referenced on our timeline. It's interesting that the roadway and driving issues mentioned in this video still hold true 17 years later.
We're estimating that practical use of semi-autonomous cars is still 4 years away and that fully autonomous vehicles won't be available to the general public for about another 10 years after that. So stay tuned to the QNX Auto Blog. I'm already envisioning a 30-year montage of our autonomous projects. With a stirring soundtrack by Styx.
QNX-powered Audi MMI framework to support Android Auto
This just in: Audi has announced that its Audi MMI mobile media application framework, which is built on the QNX CAR Platform for Infotainment, will support the new Android Auto connectivity solution.
The new feature will allow drivers to access Android-device car apps using Audi MMI displays and controls, which Audi has optimized for safe and intuitive operation on the road.
Audi states that the MMI system will still maintain its compatibility with other smartphones. Moreover, drivers will be able to switch between the Android view and Audi infotainment functions, as desired.
Audi is a long-standing customer of QNX Software Systems. Audi systems based on QNX technology include the recent Audi Virtual Cockpit and Audi Connect with Google Earth.
Audi plans to introduce Android Auto support in all-new models launched in 2015. For the complete story on Audi support for Android Auto, read the Audi press release.
QNX reference vehicle makes stopover at FTF Americas 2012
Fresh off Telematics Detroit, the QNX reference vehicle is on the road again. And this time, it’s headed to the Freescale Technology Forum (FTF) in San Antonio.
Have you seen photos of the vehicle? If so, you'll know it's a specially modified Jeep Wrangler. From the outside, the Jeep still looks the same, but beneath the hood, something has changed. For the first time, the Jeep’s head unit and instrument cluster, both based on the QNX CAR 2 application platform, are using Freescale i.MX 6 processors. And what better place than FTF to show off this new processor support?
Closeup of Jeep's instrument cluster. See previous post for more photos of vehicle.
As before, the reference vehicle will showcase several capabilities of the QNX CAR 2 platform, including:
- auto-centric HTML5 framework
- integration with a variety of popular smartphones
- one-touch Bluetooth pairing with smartphones using NFC
- ultra HD hands-free communication
- DLNA support for phone- and home-based media
- tablet-based rear-seat entertainment
- reconfigurable digital instrument cluster
- Wi-Fi hotspot
The vehicle will also demonstrate several popular third-party technologies, including Pandora, Slacker, and TuneIn Internet radio; TCS navigation; Weather Network; Best Parking; and Vlingo/AT&T Watson voice recognition.
What, more demos?
The reference vehicle isn't the only place to catch QNX technology at FTF. QNX will also showcase:
- a 3D digital instrument cluster based on a Freescale i.MX 6 quad processor and the QNX Neutrino RTOS, and built with Elektrobit's EB GUIDE Human Machine Interface environment
- a complete dashboard, including head unit and digital cluster, based on the QNX CAR 2 platform
- demos for industrial controllers, medical devices, multi-core systems, and advanced graphics, all of which run on the QNX Neutrino RTOS and Freescale silicon
QNX at the podium
Did I mention? QNX experts will also participate in several presentations and panels. Here's the quick schedule:
- The HTML5 Effect: How HTML5 will Change the Networked Car — June 19, 2:00 pm, Grand Oaks Ballroom A
- Using an IEC 61508-Certified RTOS Kernel for Safety-Critical Systems — June 20, 2:00 pm, Grand Oaks Ballroom P
- Embedded Meets Mobility: M2M Considerations and Concepts — June 20, 5:15 pm, Grand Oaks Ballroom E
- New System Design for Multicore Processors — June 21, 10:30 am, Grand Oaks Ballroom F
Visit the FTF website for details on these and other FTF presentations.
And if you're at FTF, remember to catch the QNX demos at pod numbers 1400 to 1405.
Concept car out. Reference vehicle in.
Our big announcement for Telematics Update is that we are not showing a concept car. Odd news, you say. The truth is, we're not building a concept car because we are building a reference vehicle. Splitting hairs? Not really.
Unlike the Corvette and the Porsche, our demo for this show will be based on the exact same technology that our customers are using today to design their next-generation systems.
So why vehicle instead of car? Is it a truck? Nope. A van? Negative. What about a motorcycle? Double negative.
I was hoping to give you a sneak peek at what we are working on but I'm not allowed to give away the details. However, I did manage to get these shots – let me know if you can see the vehicle. :-)

A cool surprise at the Elektrobit auto summit
Recently, our good friends at Elektrobit invited the QNX team to participate in their German Executive Automotive Summit. It was an outstanding event with all of the leading OEMs and tier ones represented. The day was filled with engaging speakers and plenty of opportunities to network.
Elektrobit held the event in a small castle near Erlangen. In the courtyard, several cars featuring Elektrobit technology (and, in almost all cases, QNX technology) were on display. The car from Delphi was especially interesting. It's a full-blown race car, complete with everything you'd expect in a track vehicle — but it also has two rear seats. These seats allow mere mortals like you and me to vicariously share the racing experience with a professional driver at the wheel.

As I stood next to it, drooling, I noticed that it was equipped with an infotainment system, mounted on the back of the driver's seat. I leaned in to have a closer look and, to my delight, saw that it was running the QNX OS. Who knew?
Making of the QNX concept car... honest
QNX helps drive new autonomous vehicle project
Have I ever mentioned the QNX-in-Education program? Over the decades, it has supported an array of university research projects, in fields ranging from humanoid robotics to autonomous aircraft. Harvard University, for example, has been a program member for more than 20 years, using QNX technology to measure and analyze ozone depletion in the stratosphere.
So, on the one hand, QNX Software Systems supports scientific and engineering research. On the other hand, it's a leader in automotive software. You know what that means: it was only a matter of time before those two passions came together. And in fact, QNX has just announced its role in DEEVA, a new autonomous car project from the Artificial Vision and Intelligent Systems Laboratory (VisLab) of the University of Parma.
The folks at VisLab already have several autonomous vehicle projects under their belts. Last year, for example, they launched a self-driving car that can negotiate downtown rush-hour traffic and complex situations like traffic circles, traffic lights, and pedestrian crossings. DEEVA incorporates the team's latest insights into autonomous driving and features a rich set of sensors that deliver a complete 3D view of the vehicle's surroundings.
With its 30-year history in safety-critical systems, the QNX OS is a natural choice for a project like DEEVA. According to Professor Alberto Broggi, president and CEO of VisLab, "in the design of our vehicle, we selected building blocks offering high reliability with proven safety records; the operating system powering the vital elements of the vehicle is one of those, and that is why we chose the QNX OS."
The QNX OS controls several systems in DEEVA, including path and trajectory planning, realtime fusion of laser data and visual data, and the user interface.
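VisLab hasn't published the details of DEEVA's fusion algorithms, but the basic idea behind combining laser and visual range data can be illustrated with a toy inverse-variance weighted average, where the more precise sensor pulls the estimate toward its reading. Everything below — the `fuse` helper, the variances, the sample numbers — is an illustrative assumption, not the actual system:

```python
def fuse(lidar_range, lidar_var, cam_range, cam_var):
    """Inverse-variance weighted fusion of two range estimates (metres).
    Toy illustration only -- not VisLab's actual algorithm."""
    w_lidar = 1.0 / lidar_var      # weight = inverse of measurement variance
    w_cam = 1.0 / cam_var
    fused = (w_lidar * lidar_range + w_cam * cam_range) / (w_lidar + w_cam)
    fused_var = 1.0 / (w_lidar + w_cam)   # fused estimate is tighter than either input
    return fused, fused_var

# Lidar is typically far more precise than a monocular camera estimate,
# so the fused range sits close to the lidar reading.
r, v = fuse(lidar_range=25.0, lidar_var=0.04, cam_range=26.0, cam_var=1.0)
```

The same weighting rule is the measurement-update step of a Kalman filter, which is one common way real fusion pipelines are structured.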
You can read the press release here and see photos of DEEVA here.
A glimpse of DEEVA (Source: VisLab)
QNX-powered Audi Virtual Cockpit shortlisted for MWC’s Global Mobile Awards
By Lynn Gayowski
2015 has just started and the QNX auto team is already off to the races. It was only last month at CES that the digital mirrors in our 2015 technology concept car were selected as a finalist for Engadget’s Best of CES Awards, in the category for best automotive tech. Now we’re excited to share some other big, award-related news. Drum roll, please… the QNX-powered Audi virtual cockpit in the 2015 Audi TT has been shortlisted for Mobile World Congress’ prestigious Global Mobile Awards, in the category for best mobile innovation for automotive!
The 2015 Audi TT features a one-of-a-kind, innovative, and just plain awesome instrument cluster — the Audi virtual cockpit — powered by the QNX operating system. With the Audi virtual cockpit, everything is in view, directly in front of the driver. All the functions of a conventional instrument cluster and a center-mounted head unit are blended into a single, highly convenient, 12.3" display. This approach allows users to interact with their music, navigation, and vehicle information in a simple, streamlined fashion. As you may recall, the QNX-powered Audi virtual cockpit also took home first place in CTIA’s Hot for the Holidays Awards late last year.
Props also to our BlackBerry colleagues, who received two nominations of their own for the Global Mobile Awards: BlackBerry Blend in the best mobile service or app for consumers category, and BlackBerry for BBM Protected in the best security/anti-fraud product or solution category.
The winners will be announced on March 3 at the Global Mobile Awards ceremony at Mobile World Congress. We can’t wait to hit Barcelona! In the meantime, check out the video below to see the Audi virtual cockpit in action.
Now with ADAS: The revamped QNX reference vehicle
By Tina Jeffrey
I walked into the QNX garage a few weeks ago and did a double take. The QNX reference vehicle, a modified Jeep Wrangler, had undergone a major overhaul both inside and out — and just in time for 2015 CES.
Before I get into the how and why of the Jeep’s metamorphosis, here’s a glimpse of its newly refreshed exterior. Orange is the new gray!

The Jeep debuted in June 2012 at Telematics Detroit. Its purpose: to show how customers can use off-the-shelf QNX products, like the QNX CAR Platform for Infotainment and QNX OS, to build a wide range of custom infotainment systems and instrument clusters, using a single code base.
From day one, the Jeep has been a real workhorse, making appearances at numerous events to showcase the latest HMI, navigation, speech recognition, multimedia, and handsfree acoustics technologies, not to mention embedded apps for parking, internet radio streaming, weather, and smartphone connectivity. The Jeep has performed dependably time and time again, and now, in an era where automotive safety is top of mind, we’ve decided to up the ante and add leading-edge ADAS technology built on the QNX OS.
After all, what sets the QNX OS apart is its proven track record in safety-certified systems across market segments — industrial, medical, and automotive. In fact, the QNX OS for Automotive Safety is certified to the highest level of automotive functional safety: ISO 26262, ASIL D. Using a pre-certified OS component is key to the overall integrity of an automotive system and makes system certification much easier.
The ultimate (virtual) driving experience
What better way to showcase ADAS in the Jeep than with a virtual drive? At CES, a 12-foot video screen in front of the Jeep plays a pre-recorded driving scene, while the onboard ADAS system analyzes the scene to detect lane markers, speed signs, and preceding vehicles, and to warn of unintentional lane departures, excessive speed, and imminent crashes with vehicles on the road ahead. Onboard computer vision algorithms from Itseez process the image frames in real time to perform these functions simultaneously.
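Itseez's production algorithms are far more sophisticated than this, but the core idea of lane-marker detection can be sketched in a few lines: look at the lower half of a frame, where lane paint appears, threshold for bright pixels, and report the columns with enough hits. Everything here — the `detect_lane_columns` helper, the thresholds, the synthetic frame — is an illustrative assumption, not the actual system:

```python
import numpy as np

def detect_lane_columns(frame, brightness=200, min_hits=5):
    """Return image columns whose lower-half rows contain enough bright
    pixels -- a crude stand-in for lane-marker detection.
    (Illustrative only; a real pipeline would use edge detection and a
    Hough transform or similar, not raw thresholding.)"""
    bottom = frame[frame.shape[0] // 2:, :]        # lane paint appears low in the frame
    bright = (bottom > brightness).sum(axis=0)     # bright-pixel count per column
    return np.flatnonzero(bright > min_hits)       # columns with enough hits

# Synthetic 100x200 grayscale frame with two vertical "lane markers"
frame = np.zeros((100, 200), dtype=np.uint8)
frame[:, 50:53] = 255      # left marker
frame[:, 150:153] = 255    # right marker
cols = detect_lane_columns(frame)
```

A departure warning then reduces to checking whether the vehicle's centerline drifts toward either detected marker band between frames.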
Here’s a scene from the virtual drive, in which the ADAS system is tracking lane markings and has detected a speed-limit sign:

If the vehicle begins to drift outside a lane, the steering wheel provides haptic feedback and the cluster displays a warning:

The ADAS system includes EB Assist eHorizon, which uses map data with curve-speed information to provide warnings and recommendations, such as reducing your speed to navigate an upcoming curve:

The Jeep also has a LiDAR system from Phantom Intelligence (formerly Aerostar) to detect obstacles on the road ahead. The cluster displays warnings from this system, as well as warnings from the vision-based collision-detection feature. For example:

POSTSCRIPT:
Here’s a short video of the virtual drive, taken at CES by Brandon Lewis of Embedded Computing Design, in which you can see curve-speed warnings and lane-departure warnings:
Fast-boot camera
Rounding out the ADAS features is a rear-view camera demo that can cold boot in 0.8 seconds on a Texas Instruments Jacinto 6 processor. As you may recall, NHTSA has mandated that, by May 2018, most new vehicles must have rear-view technology that can display a 10-by-20 foot area directly behind the vehicle; moreover, the display must appear no more than 2 seconds after the driver throws the vehicle into reverse. Backup camera and other fast-boot requirements such as time-to-last-mode audio, time-to-HMI visible, and time-to-fully-responsive HMI are critically important to automakers. Be sure to check out the demo — but don’t blink or you’ll miss it!
Full-featured infotainment
The head unit includes a full-featured infotainment system based on the QNX CAR Platform for Infotainment and provides information such as weather, current song, and turn-by-turn directions to the instrument cluster, where they’re easier for the driver to see.

Infotainment features include:
Qt-based HMI — Can integrate other HMI technologies, including EB Guide and Crank Storyboard.
Natural language processing (NLP) — Uses Nuance’s VoCon Hybrid solution in concert with the QNX NLP technology for natural interaction with infotainment functions. For instance, if you ask “Will I need a jacket later today?”, the Weather Network app will launch and provide the forecast.
EB street director — Provides embedded navigation with a 3D map engine; the map is synched up with the virtual drive during the demo.
QNX CAR Platform multimedia engine — An automotive-hardened solution that can handle:
- audio management for seamless transitions between all audio sources
- media detection and browsing of connected devices
- background synching of music for instant media playback — without the need for the synch to be completed
Support for all smartphone connectivity options — DLNA, MTP, MirrorLink, Bluetooth, USB, Wi-Fi, etc.
On-board application framework — Supports Qt, HTML5, APK (for Android apps), and native OpenGL ES apps. Apps include iHeart, Parkopedia, Pandora, Slacker, and Weather Network, as well as a Settings app for phone pairing, over-the-air software updates, and Wi-Fi hotspot setup.
So if you’re in the North Hall at CES this week, be sure to take a virtual ride in the QNX reference vehicle in Booth 2231. Beneath the fresh paint job, it’s the same workhorse it has always been, but now with new ADAS tech automakers are thirsting for.