HTML5 appears to offer a number of benefits for consumers and car manufacturers. But what's good for the goose isn't necessarily good for the developer. Talking to the people in the trenches is critical to understanding the true viability of HTML5.
Andy Gryc and Sheridan Ethier, manager of the automotive development team at QNX, pair up for a technical discussion on HTML5. They explore whether this new technology can support rich user interfaces, how HTML5 apps can be blended with apps written in OpenGL, and if interprocess communication can be implemented between native and web-based applications.
So without further ado, here’s the latest in the educational series of HTML5 videos from QNX.
This interview of Sheridan Ethier is the third in a series from QNX on HTML5.
Is HTML5 a good gamble?
As the consumer and automotive worlds continue to collide, HTML5 looks like a good bet. And not a long shot either. In fact, the odds are all automakers will eventually use it. But since the standard won’t be mature for some time yet, should you take a chance on it now?
To answer this, Andy Gryc talks to Matthew Staikos of RIM. Matthew is the manager of the browser and web platform group at RIM, and has over 10 years of software development experience with a strong focus on WebKit for mobile and embedded systems. Matthew co-founded Torch Mobile, which was acquired by RIM for their browser technology.
Andy’s conversation with Matthew is the subject of the following video, the second in an educational series designed to get an industry-wide perspective on HTML5.
This interview of Matthew Staikos is the second in a series from QNX on HTML5.
The ultimate show-me car
The fifth installment in the CES Cars of Fame series. Our inductee for this week: a most bodacious Bentley.
It's one thing to say you can do something. It's another thing to prove it. Which helps explain why we create technology concept cars.
You see, we like to tell people that flexibility and customization form the very DNA of the QNX CAR Platform for Infotainment. Which they do. But in the automotive world, people don't just say "tell me"; they say "show me". And so, we used the platform to transform a Bentley Continental GT into a unique concept car, equipped with features never before seen in a vehicle.
Now here's the thing. This is the same QNX CAR Platform found in the QNX reference vehicle, which I discussed last week. But when you compare the infotainment systems in the two vehicles, the differences are dramatic: different features, different branding, different look-and-feel.
The explanation is simple: The reference vehicle shows what the QNX CAR Platform can do out of the box, while the Bentley demonstrates what the platform lets you do once you add your imagination to the mix. One platform, many possibilities.
Enough talk; time to look at the car. And let's start with the exterior, because wow:

The awesome (and full HD) center stack
And now let's move to the interior, where the first thing you see is a gorgeous center stack. This immense touchscreen features a gracefully curved surface, full HD graphics, and TI’s optical touch input technology, which allows a physical control knob to be mounted directly on the screen — a feature that’s cool and useful. The center stack supports a variety of applications, including a 3D navigation system from Elektrobit that makes full use of the display:

At 17 inches, the display is big enough to show other functions, such as the car’s media player or virtual mechanic, and still have plenty of room for navigation:

The awesome (and very configurable) digital instrument cluster
The instrument cluster is implemented entirely in software, though you would hardly know it — the virtual gauges are impressively realistic. More impressive still is the cluster’s ability to morph itself on the fly. Put the car in Drive, and the cluster will display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulls these directions from the center stack’s navigation system. Put the car in Reverse, and the cluster will display a video feed from the car’s backup camera. You can also have the cluster display the current weather and current sound track:

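To give a rough idea of how this kind of on-the-fly reconfiguration could be wired up, here's a minimal JavaScript sketch. The gear names, widget names, and onGearChanged() hook are assumptions for illustration, not the actual QNX CAR cluster API:

```javascript
// Hypothetical sketch: pick a cluster layout based on the current gear.
// The gear names, widget names, and onGearChanged() hook are illustrative
// stand-ins, not the actual QNX CAR cluster API.
const layouts = {
  drive:   ['tachometer', 'fuelGauge', 'tempGauge', 'turnByTurn'],
  reverse: ['backupCamera'],
  park:    ['weather', 'nowPlaying']
};

function renderCluster(widgets) {
  // A real cluster would hand this list to an HTML5 or OpenGL renderer;
  // here we simply log the selection.
  console.log('Cluster showing:', widgets.join(', '));
}

function onGearChanged(gear) {
  renderCluster(layouts[gear] || layouts.park);
}

// Simulate shifting from Park into Reverse and then Drive.
['park', 'reverse', 'drive'].forEach(onGearChanged);
```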
The awesome (and just plain fun) web app
The web app works with any web browser and allows the driver to view data that the car publishes to the cloud, such as fluid levels, tire pressure, brake wear, and the current track being played by the infotainment system. It even allows the driver to remotely start or stop the engine, open or close windows, and so on:

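As a rough illustration of how such a web app might poll the cloud and issue remote commands, here's a short JavaScript sketch. The endpoint URLs, JSON fields, and element IDs are assumptions for the example, not a documented QNX interface:

```javascript
// Hypothetical sketch of a browser-based companion app. The REST endpoints,
// JSON fields, and element IDs are illustrative assumptions only.
async function refreshVehicleStatus(vehicleId) {
  const res = await fetch(`https://cloud.example.com/vehicles/${vehicleId}/status`);
  if (!res.ok) {
    throw new Error(`Status request failed: ${res.status}`);
  }
  const status = await res.json();
  document.querySelector('#tirePressure').textContent = status.tirePressure;
  document.querySelector('#washerFluid').textContent = status.washerFluid;
  document.querySelector('#nowPlaying').textContent = status.nowPlaying;
}

// Remote commands (engine start/stop, windows) could be simple POSTs.
function sendCommand(vehicleId, command) {
  return fetch(`https://cloud.example.com/vehicles/${vehicleId}/commands`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ command }) // e.g. { command: 'engineStart' }
  });
}
```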
The awesome (and nicely integrated) smartphone support
The Bentley also showcases how the QNX CAR Platform enables advanced integration with popular smartphones. For instance, the car can communicate with a smartphone to stream music, or to provide notifications of incoming email, news feeds, and other real-time information — all displayed in a manner appropriate to the automotive context. Here's an example:

The awesome everything else
I’ve only scratched the surface of what the car can do. For instance, it also provides:
- Advanced voice rec — Just say “Hello Bentley,” and the car’s voice recognition system immediately comes to life and begins to interact with you — in a British accent, of course.
- Advanced multimedia system — Includes support for Internet radio.
- Video conferencing with realistic telepresence — Separate cameras for the driver and passenger provide independent video streams, while fullband voice technology from QNX offers expanded bandwidth for greater telepresence.
- LTE connectivity — The car features an LTE radio modem, as well as a Wi-Fi hotspot for devices you bring into the car.
Moving pictures
Okay, time for some video. Here's a fun look at the making of the car:
And here's a run-through of the car's many capabilities, filmed by our friends at TI during 2013 CES:
What’s HTML5 got to do with automotive?
There’s been a lot of noise lately about HTML5. A September 2011 report by binvisions shows that search engines and social media web sites are leading the way toward adoption: Google, Facebook, YouTube, Wikipedia, Twitter, and plenty more have already transitioned to HTML5. Some are taking it even further: Facebook has an HTML5 Resource Center for developers and the Financial Times has a mobile HTML5 version of their website.
It won’t be long before HTML5 is ubiquitous. We think automakers should (and will) use it.
To elucidate the technology and its relevance, we’ve created a series of educational videos on the topic. Here is the first in that series. Interviews with partners, customers, and industry gurus will soon follow.
This simple overview is the first in a series from QNX on HTML5. (Personally I like the ending the best.)
Seamless connectivity is for more than online junkies
Much as I’m not enamored of sitting behind a computer all day, I find being off the grid annoying. Remember this email joke?
- You know you’re an online junkie when you:
- wake up at 3:00 am to go to the bathroom and stop to check your email on the way back to bed
- rarely communicate with your mother because she doesn’t have email
- check your inbox. It says ‘no new messages,’ so you check it again
Even though this joke circulated several years ago, it still strikes a chord. The big difference now is that there’s no longer a subculture of ‘online junkies.’ From the time we wake up in the morning to the time we go to bed, we all want to be connected — and that includes when we get behind the wheel. So to this joke I would add:
- resent driving because it means going off the grid
At QNX, we’re working toward a seamless experience where people can enjoy the same connectivity whether they’re texting their spouse from the mall or checking traffic reports while driving down the highway. See what I mean:
For more information about the technology described in this video, visit the QNX website.
What if…
Imagine if your car could help you become more connected to friends and family — and to the road ahead. Enter a new video that peers into the not-so-distant future.
It blows my mind, but some people still see connectivity in the car as the enemy. They think that, the more connected the car, the more distracting and dangerous it will be. But you know what? Responding to their concerns is easy. I simply ask them what if.
For instance, what if connectivity helped you drive with greater situational awareness? What if it helped you sidestep traffic jams and axle-busting pot holes? What if it helped you detect a stop sign hidden behind a tree? And what if it helped you become more connected to the people important to you, as well as to the road and the cars around you?
When we talk connectivity at QNX, that’s the kind of connectivity we envision. It isn’t just about Bluetooth or Wi-Fi or LTE — that’s only the plumbing. Rather, it’s about keeping you in tune and in sync with your car, your environment, your business, your friends. Your life.
Can HTML5 keep car infotainment on track?
Paul Leroux
The rail industry realized long ago that, unless it settled on a standard, costly scenarios like this would repeat themselves ad infinitum. As a result, some 60% of railways worldwide, including those in China, now use standard gauge, ensuring greater interoperability and efficiency.

Of course, there are existing solutions for addressing these issues. But that's the problem: multiple solutions, and no accepted standard. And without one, how will cars and mobile devices ever leverage one another out of the box, without a lot of workarounds? And how will automakers ever tap into a (really) large developer community?
No standard means more market fragmentation — and more fragmentation means less efficiency, less interoperability, and less progress overall. Who wants that?
Is HTML5, which is already transforming app development in the desktop, server, and mobile worlds, the standard the car infotainment industry needs? That is one of the questions my colleague, Andy Gryc, will address in his webinar, "HTML5 for automotive infotainment: What, why, and how?" The webinar happens tomorrow, November 15. I invite you to check it out.
RealVNC, QNX team up for mobile-to-vehicle connectivity
Paul Leroux
With RealVNC’s MirrorLink-certified SDK integrated in the QNX CAR Platform, QNX can offer a variety of connectivity features for integrating cars and smartphones through Wi-Fi, Bluetooth, and USB.
“We are delighted to work with QNX on integrating VNC Automotive into the QNX CAR Platform... many tier 1 and auto OEM customers are already using the proven combination of RealVNC and QNX technologies in production programs,” said Tom Blackie, VP Mobile RealVNC.
Read the full press release on the QNX website.
Top 8 questions for squeezing high-end tech into low-end infotainment
A couple of weeks ago, I hosted a webinar that addressed the question, “Is it possible to build an infotainment system that meets today's customer demands with yesterday's price tag?” The webinar explored several ways to reduce RAM and ROM requirements, eliminate hardware, and share hardware, all with the goal of cutting BOM costs.
As always, the audience asked lots of great questions, several of which I have answered here. Of course, these provide only a hint of the ground covered in the webinar, so I invite you to download the archived version to get the full details.
Built-in phone module versus brought-in smart phone: what is your take on this, and is a hybrid approach feasible?
The approach will vary from automaker to automaker. I think that embedded phones will be required for certain cars, especially if they use systems that rely on cloud-based services. This approach adds to the BOM cost, of course, but it may reduce overall cost, depending on what features can be off-loaded to the cloud.
Some brands will encourage brought-in devices as the lowest-cost alternative. The consumer will then have to deal with the setup and maintenance issues required to pair or charge the phone. I don’t see a clear-cut analysis that says one method will be better than the other — it really depends on what you want to accomplish.
Any thoughts on using MirrorLink to clone a virtual display to a remote physical LCD?
If you’re talking about a remote (as in cloud-based) device, I would say that HTML5 is a much more natural choice for a server-based application. If it’s a brought-in device, then MirrorLink or HTML5 could be appropriate.
If GPS is moved to a brought-in phone, how will a stolen vehicle be located?
It won’t be, unless the phone was left in the vehicle. This is one of the trade-offs you make when trying to reduce cost.
Of the cost-saving techniques you discussed, which are most likely to be used?
Already, some system designers are removing wake-up micros and DSPs. I’m not aware of any systems where the LCD has been removed, but this approach would probably offer the largest cost savings, making it a likely choice for entry-level systems and cost-sensitive markets.
Security and reliability are the main concerns of a head unit. Squeezing high-end technologies into low-end systems won’t relax those expectations. For instance, smart phone integration will be an add-on instead of replacing functionality of the head unit. Thoughts?
The trade-offs will need to be communicated to the customer. You can never build a head-unit augmented with a smartphone that works as reliably as the head unit operating by itself. As an OEM or Tier 1, you just don’t have enough control over the brought-in devices.
As an industry, we need to educate consumers. If they start relying on the phone in the car to provide certain features, then they will have to expect an inevitable degradation in overall system quality. It comes back to that famous adage: “cost, quality, or time — pick two”.
MirrorLink has a defined communication interface to the head unit. You also mentioned HTML5 as an option. Is there a defined standard yet for transmitting the HTML5 up to the head unit?
The interface between web server (i.e. phone) and web client (i.e. head unit) is already well established and tested. For some specialized features (for instance, allowing HTML5 code to access vehicle services) some standardization may be required. This will hopefully be a topic of discussion in November, at the automotive HTML5 workshop hosted by the W3C in Rome. Some Car Connectivity Consortium members have also discussed the possibility that, in the future, MirrorLink could add a transport mechanism based on HTML5.
You discussed peripheral sharing, using QNX transparent distributed processing. Does QNX TDP require secure authentication between distributed boxes?
No, it does not. It relies on standard POSIX user group permissions to provide access rights to devices.
Can you discuss any trends you see regarding Ethernet or TCP/IP in the vehicle?
Ethernet is definitely becoming more interesting in the vehicle, due to the introduction of Ethernet AVB. It makes a very natural replacement for audio-video transmission over MOST, and the additions to AVB that fulfil strict timing requirements can replace CAN or MOST for non-media vehicle messages. Ethernet also has obvious advantages when you need to access Wi-Fi networks, cloud services, and mobile devices.
QNX and Freescale talk future of in-car infotainment
Paul Leroux
If you've read any of my blog posts on the QNX concept car (see here, here, and here), you've seen an example of how mixing QNX and Freescale technologies can yield some very cool results.
So it's no surprise that when Jennifer Hesse of Embedded Computing Design wanted to publish an article on the challenges of in-car infotainment, she approached both companies. The resulting interview, which features Andy Gryc of QNX and Paul Sykes of Freescale, runs the gamut — from mobile-device integration and multicore processors to graphical user interfaces and upgradeable architectures. You can read it here.
HTML5 SDK for the QNX CAR 2 platform — the back story
Kerry Johnson
Enabling apps for the car
Almost every consumer who owns a smart phone or tablet is familiar with the app experience: you go to an online marketplace, find apps of interest, and download them onto your device. With the HTML5 SDK, the automotive team at QNX is creating an analogous experience for the car.
Just as Apple, Google, and RIM provide SDKs to help vendors develop apps for their mobile platforms, QNX has created an SDK to help vendors build apps for the QNX CAR 2 application platform. The closest analogies you will find to our HTML5 SDK are Apache Cordova and PhoneGap, both of which provide tools for creating mobile apps based on HTML5, CSS, JavaScript, and other web technologies.
App developers want to see the largest possible market for their apps. To that end, QNX also announced today that it will participate in the W3C’s Web and Automotive Workshop. The workshop aims to achieve industry alignment on how HTML5 is used in the car and to find common interfaces to reduce platform fragmentation from one automaker to the next. Obviously, app developers would like to see a common auto platform, while automakers want to maintain their differentiation. Thus, we believe the common ground achieved through W3C standardization will be important.
It bears mentioning that, unlike phone and tablet apps, car apps must offer a user experience that takes driver safety into consideration. This is a key issue, but beyond the scope of this post, so I won’t dwell on it here.
So what’s in the SDK, anyway?
As in any SDK, app developers will find tools to build and debug applications, and APIs that provide access to the underlying platform. Specifically, the SDK will include:
- APIs to access vehicle resources, such as climate control, radio, navigation, and media playback
- APIs to manage the application life cycle: start, stop, show, hide, etc.
- APIs to discover and launch other applications
- A packaging tool to combine application code (HTML, CSS, JavaScript) and UI resources (icons, images, etc.) with QNX CAR APIs to create an installable application – a .bar file
- An emulator for the QNX CAR 2 platform to test HTML5 applications
- Oh yeah, and documentation and examples
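To give a feel for what calling these APIs could look like from an HTML5 app, here's a brief JavaScript sketch. The qnxcar object is a mock written purely for illustration; the real SDK defines its own extension namespaces and method signatures:

```javascript
// Illustrative sketch only: the qnxcar object below is a mock standing in for
// the SDK's JavaScript extensions, whose real names and signatures may differ.
const qnxcar = {
  hvac:      { get: (setting, cb) => cb(21) },                  // pretend value
  lifecycle: { on: (event, cb) => { /* register callback */ } },
  apps:      { launch: (id) => console.log('launching', id) }
};

function initClimateScreen() {
  // Read the current cabin temperature setting and show it in the UI
  // (assumes the page contains an element with id="setTemp").
  qnxcar.hvac.get('temperature', function (value) {
    document.querySelector('#setTemp').textContent = value + ' °C';
  });

  // React to application life-cycle events: stop background work when hidden.
  qnxcar.lifecycle.on('hide', function () {
    console.log('app hidden, pausing background work');
  });

  // Discover and launch another installed application.
  qnxcar.apps.launch('navigation');
}

document.addEventListener('DOMContentLoaded', initClimateScreen);
```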
The development and deployment flow looks something like this:

Emulator and debugging environment
The QNX automotive team has extended the Ripple emulator environment to work with the QNX CAR 2 application platform. Ripple is an emulation environment, originally designed for BlackBerry smartphones, that RIM has open-sourced on GitHub.
Using this extended emulator, application developers can test their applications with the correct screen resolution and layout, and watch how their application interacts with the QNX CAR 2 platform APIs. For example, consider an application that controls audio in a car: balance, fade, bass, treble, volume, and so on. The screenshot below shows the QNX CAR 2 screen for controlling these settings in the Ripple emulator.

Using the Ripple emulator to test an audio application. Click to magnify.
In this example, you can use the onscreen controls to adjust volume, bass, treble, fade, and balance; you can also observe the changes to the underlying data values in the right-hand panel. And you can work the other way: by changing the controls on the right, you can observe changes to the on-screen display. The Ripple interface supports many other QNX CAR 2 features; for examples, see the QNX Flickr page.
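For a sense of what the app side of this looks like, here's a small JavaScript sketch of slider handlers you could exercise in the emulator. The element IDs are assumptions, and setAudioParameter() is a stub standing in for the platform's real audio API:

```javascript
// Illustrative sketch: wire on-screen sliders to the platform's audio settings.
// setAudioParameter() is a stub; a real app would call the SDK's audio
// extension, and Ripple would mirror the new value in its data panel.
function setAudioParameter(name, value) {
  console.log('audio setting changed:', name, '=', value);
}

// Assumes each slider is an <input type="range"> whose id matches its name.
['volume', 'bass', 'treble', 'fade', 'balance'].forEach(function (name) {
  const slider = document.querySelector('#' + name);
  slider.addEventListener('change', function () {
    setAudioParameter(name, Number(slider.value));
  });
});
```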
You can also use the emulator in conjunction with the Web Inspector debugger to do full source-code debugging of your JavaScript code.
Creating native services
Anyone who has developed software for the QNX Neutrino OS knows that we offer the QNX Momentics Tool Suite for creating and testing C and C++ applications. With the QNX CAR 2 application platform, this is still the case. Native-level services are built with the QNX Momentics suite, and HTML5 applications are built with our new HTML5 SDK. We've decided to offer the suite and the SDK as separate packages so that app developers who need to work only in the HTML5 domain needn't worry about the QNX Momentics Tool Suite and vice versa. Together, these toolkits allow you to create HTML5 user interface components with underlying native services, where required.
Attending SAE Convergence? Here’s why you should visit booth 513
Cars and beer don’t mix. But discussing cars while having a beer? Now you’re talking. If you’re attending SAE Convergence next week, you owe it to yourself to register for our “Spirits And Eats” event at 7:00 pm Tuesday. It’s the perfect occasion to kick back and enjoy the company of people who, like yourself, are passionate about cars and car electronics. And it isn’t a bad networking opportunity either — you’ll meet folks from a variety of automakers, Tier 1s, and technology suppliers in a relaxed, convivial atmosphere.
But you know what? It isn’t just about the beer. Or the company. It’s also about the Benz. Our digitally modded Mercedes-Benz CLA45 AMG, to be exact. It’s the latest QNX technology concept car, and it’s the perfect vehicle (pun fully intended) for demonstrating how QNX technology can enable next-generation infotainment systems. Highlights include:
- A multi-modal user experience that blends touch, voice, and physical controls
- A secure application environment for Android, HTML5, and OpenGL ES
- Smartphone connectivity options for projecting smartphone apps onto the head unit
- A dynamically reconfigurable digital instrument cluster that displays turn-by-turn directions, notifications of incoming phone calls, and video from front and rear cameras
- Multimedia framework for playback of content from USB sticks, DLNA devices, etc.
- Full-band stereo calling — think phone calls with CD quality audio
- Engine sound enhancement that synchronizes synthesized engine sounds with engine RPM
Here, for example, is the digital cluster:

And here is a closeup of the head unit:

And here’s a shot of the cluster and head unit together:

As for the engine sound enhancement and high-quality hands-free audio, I can’t reproduce these here — you’ll have to come see the car and experience them first hand. (Yup, that's an invite.)
If you like what you see, and are interested in what you can hear, visit us at booth #513. And if you'd like to schedule a demo or reserve some time with a QNX representative in advance, we can accommodate that, too. Just send us an email.
Car Connectivity Consortium (CCC) MirrorLink meeting, Chicago, September 29, 2011
For those who aren't yet up to speed on "MirrorLink", it's the new name for what was previously called Terminal Mode. The name change is a definite improvement. I informally polled people to ask what they thought when they first heard "Terminal Mode". The answers fell into two camps: either a telnet replacement or a disease you really don't want your doctor telling you that you have. Neither sounds like a ringing endorsement! MirrorLink as a term makes sense. Good job, CCC.
Here are some of my observations and notes from the CCC show in Chicago last week.
- MirrorLink will not be going away any time soon--there is enough industry momentum to keep it alive for a while. Sounds like roughly 60% of the car makers and 60% of the mobile makers are behind it to some extent or another.
- QNX is very bullish on HTML5 as a replacement for MirrorLink-like features, but it doesn’t look like HTML5 is part of the future MirrorLink strategy at all. Instead, they’re looking at HDMI or MDL—direct video from the mobile with a control channel. This is a generic replacement for iPod out, and it's an approach that we've considered as well and will likely support, so this is a good alignment at least in direct video technology. Even though they don't see the wisdom of the HTML5 path yet (patience--they'll get there :-).
- OEMs don’t seem to realize how badly this will impact their revenue chain or are taking the "cross your fingers" approach. Certainly many seemed to be focused solely on the value MirrorLink provides by enabling customers and building new markets. I think it's somewhat Pollyanna-ish to not admit MirrorLink has the potential to completely decimate in-vehicle navigation uptake. If I can bring my phone in for navigation for a half-way decent experience with a built-in screen, who's going to spend $3000 on a nav-only solution?
- MirrorLink isn’t focused so much on enabling third-party apps (although they did talk about it) as on mirroring custom-built phone apps into the car. Everything that was demoed in the demo room breakout was a customized app that provided an integrated experience. This is both bad and good. Bad because it definitely reduces the short-term promise of opening up a huge third party ecosystem. Good because I think it's the only reasonable way to go--there's really no other way OEMs can justify the liability of phone apps within the car, unless they can have some measure of control.
- I still think there's a significant amount of work needed to address safety concerns around driver distraction. The MirrorLink specification and the general CCC communications contain "driver-safe" messaging. However, my take is that the actual participants, especially on the mobile side, seem to discount their accountability when third-party apps are brought into the car, and nothing in the specification really makes it possible for an automotive outsider to make a car-safe app. I highly doubt this approach will fly. The application-level certification that is planned for a future MirrorLink 1.1 release seems almost a mandatory requirement before this issue can be put to bed.
- Interestingly, almost every car company I talked to had a different take on how MirrorLink will impact their strategy—some see it as a low-end only play, but others see it as a high-end play. There's still a lot of confusion as to where it slots into product lines. I didn’t talk to anyone there who isn’t going to do it at all (not surprising given that the show was completely MirrorLink-focused), but some didn't seem to put a lot of weight behind it. The perception I had was that some were doing it to "keep up with the Joneses."
- I give the CCC credit for realizing that MirrorLink has a lot of danger for the fragmentation whirlpool that has plagued Android releases and that makes Bluetooth interop the biggest nightmare for those who implement it and test it. To that end, they're really trying to take this one head-on. It's still very early days to see if they will be successful, with the first MirrorLink 1.0.1 systems coming out in production. (Alpine's aftermarket ICS-X8 earns that "first to market" distinction.) I hold out hope that CCC can keep MirrorLink interop from becoming a quagmire, but this is a bugger of a problem to fix in an area that tries to tie "slow-moving" car tech with the mobile space, so keep your eyes peeled...
What’s next for the connected car?
It’s been almost three years since QNX Software Systems launched its connected car concept, and I thought it would be an interesting exercise to look at what has been accomplished in the automotive industry around the connected car and how some of the concepts are evolving. When the QNX CAR Application Platform was introduced, we provided a simple way to look at a connected car, using four “dimensions” of connectivity:
- Connected to portable consumer devices for brought-in media and handsfree communications
- Connected to the cloud for obvious reasons
- Connected within the car for sharing information and media between front and rear seats, between the center stack and the cluster, and for other similar functions
- Connected around the car for providing feedback to the driver about the environment surrounding the car, be it pedestrians, other cars, or vehicle to infrastructure communications
Connected to consumer devices, connected to the cloud
Why lump these two together? There is not exactly a clear line between the two since consumer devices are often just extensions of the cloud. If my car connects to a smartphone which, in turn, draws information from the cloud, is there much point in creating a distinction between consumer device and cloud connections? Although it made sense to differentiate between cloud and consumer device connections when phones provided only handsfree calling and simple music playback, today the situation is quite different.
Device integration into the car has been a beehive of activity over the last few years. Smartphones, superphones, and tablets are providing entertainment, social networking, news, and access to a myriad of other content and applications to consumers anywhere, anytime. Automakers want to take advantage of many of these capabilities in a responsible, non-distracting way.
The primary issue here is how to marry the fast-paced consumer electronics world to the lifecycle of the car. At present, the available solutions sit at opposite ends of the spectrum: standardized Bluetooth interfaces that allow the car to control the smartphone; and screen-replication technologies (iPod Out, VNC/Terminal Mode/MirrorLink) where the smartphone takes control and uses the car as a dumb display.
Neither of these scenarios takes full advantage of the combined processing power and resources of the car and the brought-in device. This, to me, is the next phase of car and cloud connectivity. How can the power of the cloud, brought-in devices, and the in-car systems be combined into a cooperative, distributed system that provides a better driver and passenger experience? (If the notion of a distributed system seems a bit of a stretch, consider the comments made by Audi Chairman Rupert Stadler at CES 2011.)
When looking for technologies that can bring the cloud, devices, and car together, you do not need to look any further than the web itself. The software technologies (HTML5, JavaScript, AJAX, peer-to-peer protocols, tooling, etc.) that drive the web provide a common ground for building the future in-car experience. These technologies are open, low cost, widely known, and widely accessible. What are we waiting for?
Connected within the car
The integrated cockpit has emerged as a prevalent automotive design concept. It is now commonplace in higher-end vehicles to see seamless integration between center stack functions and the instrument cluster and driver information displays. For example, turn-by-turn directions, radio tuning, and the currently playing song are all available to the driver on the cluster, reducing the need to constantly glance over to the main display. One such example is the Audi cluster:
Connected around the car
Three years ago, systems in this category were already emerging, so there really wasn’t much of a crystal ball required here. Adaptive cruise control has become one of the most common features illustrating how a car can connect to its surroundings: it detects the cars ahead of you and adjusts your speed accordingly. Other examples include pedestrian detection (offered in the Volvo S60 and other models), automatic parking, lane departure warning, and blind-spot detection/warning systems. These Advanced Driver Assist Systems (ADAS) will become more common as cost reductions take place and the technology is provided in lower-end vehicles.
In contrast, vehicle-to-infrastructure communication, which requires industry-wide collaboration, is proceeding at the pace you’d expect from an internationally standardized solution.
MirrorLink misunderstood: 8 myths that need busting

MirrorLink is intended to extend the life of in-vehicle systems by allowing them to interact with mobile content and to support new features that didn’t exist when the car rolled off the assembly line.
Here's an illustration of how it works:

MirrorLink in-car communication. The protocol between the head unit and the phone can run over several transports, including USB, Bluetooth, or Wi-Fi. This example assumes Bluetooth for the audio back-channel.
When I talk to people in the automotive and mobile industries, I find they share a number of common misconceptions about MirrorLink, which I’d like to clear up. So let's get started, shall we?
- MirrorLink is an Android technology. In fact, MirrorLink works with multiple mobile platforms. Phones using Android can support it, but so can phones from any other phone maker that supports the standard. Even Apple phones could support it, though Apple has currently chosen to go their own route with Apple-specific solutions.
- MirrorLink allows any mobile app to run in the car. This is incorrect. A MirrorLink app can run in the car only if the car maker grants “trust” to that app. Each car maker has a different concept of what brands to promote, what features are safe, or what works well with each car. So, in reality, each app will be enabled depending on the individual make — or even model — of car.
- MirrorLink promotes “driver distracting” apps. Also incorrect. MirrorLink is an enabling technology that doesn’t promote any type of app in particular. In fact, because the car maker must grant trust to an app, the app developer can't control what apps run in the car. That responsibility remains the domain of car makers, who tend to avoid anything that will cause distraction when displayed on a front-seat screen.
- MirrorLink is the only way to connect an app to the car. There are in fact two others: iPod Out and HTML5. Apple supports iPod Out for Apple devices, which allows selected applications to output analog video to the head unit. (Note that the new iPhone 5 doesn’t support iPod Out.) HTML5 also allows mobile apps to run in the head unit, though its use in car-to-phone bridging is still in the early stages. QNX Software Systems has demonstrated concept vehicles that use BlackBerry Bridge (an HTML5-based technology) to connect an HTML5 app on a BlackBerry phone to the car’s head unit.
- Mobile app makers will benefit most from MirrorLink. In fact, car makers may end up taking best advantage of the technology. That’s because they can use MirrorLink to customize and create apps, and to refresh those apps as a way of delivering fresh, new functions to their customers. MirrorLink gives them the ability to do this using a standardized protocol supported by most mobile platforms. Car makers could use MirrorLink very effectively, even if they never allowed any third party apps into their cars.
- HTML5 and MirrorLink are incompatible. Not necessarily true. Current versions of MirrorLink use the VNC protocol to exchange graphical data. None of the advantages of HTML5 would be incompatible with a future version of MirrorLink; in fact, some members of the Car Connectivity Consortium (CCC), including QNX Software Systems, would likely be interested in merging these two standards. That would result in a new version of MirrorLink that uses HTML5 as the underlying communication protocol. (The MirrorLink specification is controlled by the Car Connectivity Consortium, of which QNX is a member.)
Even if MirrorLink does go to HTML5, the industry would still need a VNC-based form of MirrorLink. VNC has much lighter requirements on the head-unit side, so it makes more sense than HTML5 if the car doesn’t have a high-powered CPU or lots of memory. The broadest possible option would be to have phone apps support multiple versions of MirrorLink (today's version with VNC plus a future version with HTML5) and to use whichever one makes sense, depending on what the car supports.
- MirrorLink obviates the need for car-downloadable apps. Yes, MirrorLink capability is somewhat similar in purpose to downloading apps into the car; they both extend the functionality of the car after it leaves the factory. Because the customer’s phone will almost certainly be newer than the car’s electronics, it will have a faster CPU, giving the raw speed advantage to a MirrorLink app on the mobile. The MirrorLink app will also have guaranteed data access, since the hosting phone will always have a data pipe — something that isn't certain on the car side of the equation.
On the other hand, MirrorLink doesn’t give an app access to car features that would be available to a car-downloaded app — features such as vehicle bus access, telematics features, or the navigation system. Also, a car-downloaded app would likely have a faster HMI than any off-board app, even if the mobile had a faster CPU, because of latencies inherent to screen replication. The car-downloaded app would also have better visual integration, as it could take full advantage of the car’s features, instead of appearing as a bolt-on product. Other factors, based on automaker control, compatibility, or product roadmaps, could also favor an in-car solution. Even if you could address some of these issues, there would still be enough reasons for MirrorLink and an auto app store to live side by side.
- MirrorLink apps can be built today. This is technically true. But, in their enthusiasm, new converts can sometimes forget that cars need to support MirrorLink for anything to actually work. Currently, only aftermarket car stereos support MirrorLink; no production vehicles support it. So if you’re a mobile app developer, the market for MirrorLink apps today is negligible. But expect this situation to improve dramatically over the next two to three years as production vehicles start to ship with this capability built-in.
QNX CAR 2 — the extended version
The world of video is a ruthless one; no sooner had we posted the QNX CAR demo than it was out of date.
But, hold on a minute. As I write this I realize it’s not the video world at all; it’s the software world that creates new technology at breakneck speed. And QNX certainly does its part.
The QNX CAR 2 application platform has come a long way in a matter of months. We needed to update the original video to keep pace with the technology but also to address customer demand for more detailed information.
So this video is a step-by-step demo – definitely not for the tire kicker. But if you really want details on what automakers and Tier 1s can achieve with QNX CAR technology, hit play.
Garmin taps QNX technology to create K2 infotainment platform
Complete digital cockpit delivers navigation, diagnostics, streaming media, smartphone integration, and voice recognition
This just in: Garmin International has selected the QNX CAR platform to power the Garmin K2, a next-generation infotainment solution for automakers.
Most people are familiar with Garmin's many portable GPS devices, from sports watches to action cameras to PNDs. But the K2 is a different animal altogether — a complete “digital cockpit” that comprises multiple digital displays, on- and off-board voice recognition, smartphone integration, and optional embedded 4G connectivity.
The K2 is designed to give drivers simple, intuitive access to navigation, vehicle diagnostics, streaming media, and realtime Web information. It's also designed with scalability in mind, so automakers can use it to address diverse market requirements and cost targets.
According to Matt Munn, managing director of Garmin’s automotive OEM group, “the QNX CAR platform has played a major role in helping us to achieve our goal of providing both world-class software reliability and flexible access to emerging consumer applications. From the proven stability and performance of the QNX architecture to the company’s worldwide industry recognition, QNX was the logical choice.”
Other key features of the K2 include a 3D-enhanced city model, a predictive services calendar, and remote personalization and control via a web portal or smartphone.
Here's the K2 at a glance:
Source: Garmin
And here's a demo of the system, filmed by Engadget at 2013 CES:

For more information on this announcement, read the press release. And for more on the K2 itself, visit the Garmin blog.
QNX Acoustics for Voice — a new name and a new benchmark in acoustic processing
Tina Jeffrey
Designed as a complete software solution, the product includes both the QNX Acoustics for Voice signal-processing library and the QWALive tool for tuning and configuration.
The signal-processing library manages the flow of audio during a hands-free voice call. It defines two paths: the send path, which handles audio flowing from the microphones to the far end of the call, and the receive path, which handles audio flowing from the far end to the loudspeakers in the car:

QWALive, used throughout development and pre-production phases, gives developers realtime control over all library parameters to accelerate tuning and diagnosis of audio issues:

A look under the hood
QNX Acoustics for Voice 3.0 builds on QNX Software Systems’ best-in-class acoustic echo cancellation and noise reduction algorithms, road-proven in tens of millions of cars, and offers breakthrough advancements over existing solutions.
Let me run through some of the innovative features that are already making waves (sorry, couldn’t resist) among automotive developers.
Perhaps the most significant innovation is our high efficiency technology. Why? Well, simply put, it saves up to 30% both in CPU load and in memory requirements for wideband (16 kHz sample rate for HD Voice) and Wideband Plus (24 kHz sample rate). This translates into the ability to do more processing on existing hardware, and with less memory. For instance, automakers can enable new smartphone connectivity capabilities on current hardware, without compromising performance:

Another feature that premieres with this release is intelligent voice optimization technology, designed to accelerate and increase the robustness of send-path tuning. This technology implements an automated frequency response correction model that dynamically adjusts the frequency response of the send path to compensate for variations in the acoustic path and vehicle cabin conditions.
Dynamic noise shaping, which is exclusive to QNX Acoustics for Voice, also debuts in this release. It enhances speech quality in the send path by reducing broadband noise from fans, defrost vents, and HVAC systems — a welcome feature, as broadband noise can be particularly difficult for hands-free systems to contend with.
Flexibility and portability — check and check
Like its predecessor (QNX Aviage Acoustic Processing 2.0), QNX Acoustics for Voice 3.0 continues to offer maximum flexibility to automakers. The modular software library comes with a comprehensive API that eases integration into infotainment, telematics, and audio amplifier modules. Developers can choose between fixed- and floating-point versions, which can be ported to a variety of operating systems and deployed on a wide range of processors and DSPs.
We’re excited about this release as it’s the most sophisticated acoustic voice processing solution available to date, and it allows automakers to build and hone systems for a variety of speech requirements, across all their vehicle platforms.
Check out the QNX Acoustics for Voice product page to learn more.
The summer road trip of 2017 — Part I
Lynn Gayowski
Tunes for the road
A road trip without a soundtrack is a road trip I want no part of. I think we can all agree that a Britney Spears playlist is compulsory. Music has always been intimately connected to the driving experience (see the Highway Hi-Fi Phonograph below for proof).
The media sources that you depend on today — local drives, USB storage devices, smartphones, cloud services — will work seamlessly with your vehicle, allowing you and your passengers to enjoy any genre from any source. Traditionally confined to the center stack, music metadata will permeate all the screens of your car, even the instrument cluster.
The context-aware cockpit
The road trip of years past was plotted on a paper map and required a navigator in the passenger seat; today's passengers are relieved of these duties as navigation and route plotting have gone digital. But even with that convenience, having to divert your eyes from the road to the center stack can be a nuisance. The dashboard of 2017 will offer greater convenience with a driver-centric display that could blend navigation and digital cluster information on one screen. These vehicles will be "context aware" and display different information depending on the environment. For instance, surround-view cameras could detect pedestrians or cyclists and provide a minimalist on-screen alert to minimize driver distraction. Similarly, the system may disable certain functionality when the driver is about to navigate a hairpin turn. If the vehicle "knows" there's a challenge ahead related to road conditions, visibility, local speed limits, traffic, or topography, it could display the appropriate context-relevant information to the driver.
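As a purely speculative illustration (none of this describes an actual product, and every signal name is invented), the decision logic behind such a context-aware display could start out as little more than a handful of rules that map vehicle and environment signals to minimal driver alerts:

```python
# Speculative sketch only -- all signal names and rules are invented.
# Maps a few context signals to the minimal alerts a driver-centric display might show.
def cluster_alerts(context):
    """context: dict of simple vehicle/environment signals."""
    alerts = []
    if context.get("pedestrian_detected"):
        alerts.append("PEDESTRIAN AHEAD")            # e.g., from surround-view cameras
    if context.get("hairpin_ahead"):
        alerts.append("SHARP TURN: NON-ESSENTIAL CONTROLS LOCKED")
    if context.get("speed_kmh", 0) > context.get("speed_limit_kmh", float("inf")):
        alerts.append("OVER THE POSTED LIMIT")
    return alerts

# Example: a pedestrian is detected while the car is over the local limit
print(cluster_alerts({"pedestrian_detected": True, "speed_kmh": 72, "speed_limit_kmh": 60}))
```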
Staying mobile
By 2017, you’ll probably have a new smartphone and, regardless of the platform, it’ll be able to communicate with your car. Projection mode technologies will be commonplace, rendering your phone’s display and services onto your car’s center stack (one example is Audi’s QNX-powered MMI mobile media application framework). This integration will no doubt get even more advanced in the coming years, and with Apple’s CarPlay and Google’s Android Auto connectivity protocols taking form, your favorite apps will be as at home on your dash as they are in your hand.
Your phone will also be able to control and monitor your car in new ways via the much-discussed, but sometimes nebulous, cloud. For instance, let’s say you find yourself at a behemoth rest stop and can’t remember the location of your car after indulging in the roadside cuisine. Your phone’s “key fob” app could tell you exactly where your car is — it could even let you check your oil and washer fluid remotely to see if your car is in shape to make it through the next leg of your trip.
How is in-car technology playing a role in your current summer road trip? How do you want it to improve your future road trips? What’s your favorite road trip destination? (My personal favorite is Washington, DC.)
For safety’s sake, why don’t cars just disable phones?

Using technology to control inappropriate phone use has been a topic at some of the driver distraction meetings I've attended. One proposed solution involves a technique called micro location — using ultrasonic waves to identify where in the cabin the phone is located. There are other ways to triangulate the phone's position, but they all require coordination between the phone and the car. Knowing where the phone resides in the car is a requirement, as most passengers wouldn’t be happy to have their phone automatically disabled just because they’re in the car. And the solution can’t be based only on the GPS speed of the phone, or you’d have lots of irate bus, taxi, train, or subway riders.
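To make the idea concrete, here is a toy sketch of micro location by time of flight. It illustrates the general technique only, not any real phone or car implementation, and the speaker names and timings are invented: each door speaker plays a short ultrasonic chirp at a known time, the phone timestamps each arrival, and the shortest acoustic path suggests which seat the phone occupies.

```python
# Illustrative sketch of the general technique only -- not a real implementation.
# Each door speaker plays an ultrasonic chirp at a known time; the phone measures
# how long each chirp takes to arrive, and the shortest acoustic path tells us
# which seat the phone is probably in.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def classify_seat(arrival_delays):
    """arrival_delays: dict of speaker name -> seconds from emission to arrival."""
    distances = {name: delay * SPEED_OF_SOUND for name, delay in arrival_delays.items()}
    nearest = min(distances, key=distances.get)
    return "driver side" if nearest == "driver_door" else "passenger side"

# Example: the driver-door chirp travels ~0.7 m, the passenger-door chirp ~1.5 m,
# so the phone is likely within the driver's reach and hands-free rules could apply.
print(classify_seat({"driver_door": 0.0021, "passenger_door": 0.0043}))
```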
The fact is, unless all phone makers and car makers agree on the same standard, there's no incentive for either side to build half of a feature. You’d need to deploy potentially expensive technology that wouldn’t work unless you pair exactly the right phone with the right car. This likely won't happen unless companies are legislated to do so.
Given the speed of automotive development, it’s impossible for the car guys to build a technology that the phone guys won't leave in the dust, unless some guarantees are put in place. The adoption of Bluetooth is a good example. It took years before Bluetooth became widespread in phones, and its adoption had more to do with Bluetooth earpieces than with connections to cars. Car makers took a long time to roll out Bluetooth support as a standard feature because too many phones either didn't have it or had an implementation that wasn't fully compatible. Eventually, the two markets synchronized, but it took several years.
One argument against a technology-mandated disable is that not all jurisdictions agree on what is, or isn’t, allowable. In the US, 45 out of 50 states have some form of prohibition against using phones in cars. But what is disallowed varies widely by state — some don't allow any use of the phone (even hands-free), some prohibit teenagers but no other age groups, some disallow texting but not hands-free, some disallow use for commercial vehicles but not private vehicles, and some allow everything.
Another argument against a technological solution is that people can be educated to assume responsibility for their behavior. For example, why don't all cars have a blood alcohol level blow-tester hooked up to the ignition? Technically it's possible, but it's very expensive to do it from the car maker's standpoint. One could argue that it is worth it to have cars protect us from ourselves. But as a society, we've decided that, in the case of drunk driving, we are willing to give people back the responsibility. Rather than control the problem with technology, we socialize and educate people that driving intoxicated is an undesirable behavior.
We could, of course, decide to do the same with mobile technology, by educating personally instead of solving technically. This approach may make more sense than a technology-based prohibition: technology always moves at light speed compared to legislative mechanisms of control.