
HTML5 and the software engineer

HTML5 appears to have a number of benefits for consumers and car manufacturers. But what is often good for the goose is not necessarily good for the developer. Talking to the guys in the trenches is critical to understanding the true viability of HTML5.

Andy Gryc and Sheridan Ethier, manager of the automotive development team at QNX, pair up for a technical discussion on HTML5. They explore whether this new technology can support rich user interfaces, how HTML5 apps can be blended with apps written in OpenGL, and if interprocess communication can be implemented between native and web-based applications.

So without further ado, here’s the latest in the educational series of HTML5 videos from QNX.



This interview of Sheridan Ethier is the third in a series from QNX on HTML5.

Is HTML5 a good gamble?

As the consumer and automotive worlds continue to collide, HTML5 looks like a good bet. And not a long shot either. In fact, the odds are all automakers will eventually use it. But since the standard won’t be mature for some time yet, should you take a chance on it now? 

To answer this, Andy Gryc talks to Matthew Staikos of RIM. Matthew is the manager of the browser and web platform group at RIM, and has over 10 years of software development experience with a strong focus on WebKit for mobile and embedded systems. Matthew co-founded Torch Mobile, which was acquired by RIM for their browser technology.

Andy’s conversation with Matthew is the subject of the following video, the second in an educational series designed to get an industry-wide perspective on HTML5. 




This interview of Matthew Staikos is the second in a series from QNX on HTML5.

Open standards, open source, and why the difference matters

As Andy Gryc reported in a previous post, Paul Hansen of the Hansen Report asked six automakers whether they plan to ship products based on the GENIVI open source platform. Not one of them said yes.

This underwhelming response to open source may seem surprising, especially to people outside of the auto industry. It seems even more surprising when you consider the many companies that belong to the GENIVI Alliance — a veritable who’s who of high-tech and automotive companies, from ARM to IBM to Volvo. Why the disconnect?

A couple of reasons come to mind. First, the automotive market is exceedingly competitive. Asking automakers to collaborate on a common OS platform — the GENIVI approach — is arguably a non-starter. Also, many automakers seem to grasp that open source OSs don't necessarily address the issues that matter to them most.

Allow me to explain. Automotive companies entertain the option of using open source for several reasons. They want to avoid vendor lock-in. They want to leverage a large developer community. They want to access a rich toolset. And, in many cases, they hope to avoid the costs of runtime licensing.

Yes, open source can help address these requirements. But more often than not, open standards offer a better route to achieving what automakers really need.

Vendor neutral, OS neutral, hardware neutral
Take the goal of avoiding vendor lock-in. An open standard is, by definition, vendor neutral. It is typically the product of a collaborative and transparent process free of domination by a single company or interest group. Likewise, it isn’t controlled or maintained by a single, self-interested entity. HTML5, for instance, isn’t owned by any one company, but is a standard embraced by Apple, Google, Microsoft, Mozilla, QNX, and others.

HTML5 isn't just vendor neutral; it's also OS and hardware neutral. By using it as an HMI and application environment, automakers gain the freedom to choose the best OS platform for the job at hand, and the option to migrate across platforms, if required. In other words, HTML5 enables automakers to use the platform that can offer the fastest boot speed, the highest reliability, the best mobile device integration, or the best performance on automotive silicon — things that can reduce costs and improve the user experience. (To put this another way, the underlying OS platform is anything but a commodity — a fact demonstrated every day in the mobile device world.)

An open source platform may or may not share these characteristics. Even though developers can access the source code, a single entity may still control the technology’s roadmap and licensing terms. In effect, the platform can constitute a single point of failure for the automaker — exactly what automakers try to avoid. Compare this to an open standard, which is defined collaboratively and then supported over a long period of time. POSIX, with its 20+ year history, comes to mind.

Also, open standards like HTML5 are unencumbered by the protective licensing terms often associated with open source OSs — terms that can lead to greater system costs and complexity. For instance, the GNU General Public License (GPL) that governs use of the Linux OS requires that any modifications to the original program be released as open source. That's a problem for any OEM that doesn’t want to “open source” its technology, such as vehicle bus information. It is also incompatible with the certifications and licenses of consumer device manufacturers whose licensing terms are designed to prevent integration of their code with GPL code bases; iPod support and integration is a good example. Such technologies must, as a result, be separated into another virtualized OS or onto external hardware modules. The result is a more expensive and more complex system — another thing that automakers try to avoid.

Delivering the goods
Of course, all this hinges on whether a standard like HTML5 can deliver the goods. And from my perspective, it does. For instance, it can provide all the capabilities of a traditional HMI toolkit, including a rendering engine, content authoring and packaging tools, and sophisticated graphic transitions. But unlike proprietary solutions, it can also help automakers:

  • tap into a vast pool of apps and developers
  • integrate with mobile devices
  • build user interfaces that incorporate virtually any delivery model
  • customize the UX and simplify access to mobile apps
  • customize apps and the UX for context: park, creep, drive, etc.

In addition, HTML5 can, with the right platform, work in concert with other HMI technologies (Adobe AIR, OpenGL ES, Qt, etc.) and blend seamlessly with those technologies on the same display. As a result, system designers can choose the most appropriate technology for each application.

Incorporating open source
So is open source a total non-starter in automotive? Absolutely not.

In fact, many standards incorporate open source. Let us once again consider HTML5. While HTML5 is itself an open standard, many HTML5 implementations are developed using open source software. For instance, many of the current, industry-leading HTML5 solutions are built on WebKit, an open source browser engine governed by the GNU Lesser General Public License (LGPL).

The point is, the most successful solutions will combine the best that open standards, open source, and proprietary platforms have to offer. But if you were to mandate an “open” solution, an open standard would be the best to rally behind.



If you're interested in this topic, we recommend you listen to the webinar that Andrew gave last week, "In-vehicle product differentiation: open standards vs open source." — Ed.

The ultimate show-me car

The fifth installment in the CES Cars of Fame series. Our inductee for this week: a most bodacious Bentley.

It's one thing to say you can do something. It's another thing to prove it. Which helps explain why we create technology concept cars.

You see, we like to tell people that flexibility and customization form the very DNA of the QNX CAR Platform for Infotainment. Which they do. But in the automotive world, people don't just say "tell me"; they say "show me". And so, we used the platform to transform a Bentley Continental GT into a unique concept car, equipped with features never before seen in a vehicle.

Now here's the thing. This is the same QNX CAR Platform found in the QNX reference vehicle, which I discussed last week. But when you compare the infotainment systems in the two vehicles, the differences are dramatic: different features, different branding, different look-and-feel.

The explanation is simple: The reference vehicle shows what the QNX CAR Platform can do out of the box, while the Bentley demonstrates what the platform lets you do once you add your imagination to the mix. One platform, many possibilities.

Enough talk; time to look at the car. And let's start with the exterior, because wow:



The awesome (and full HD) center stack
And now let's move to the interior, where the first thing you see is a gorgeous center stack. This immense touchscreen features a gracefully curved surface, full HD graphics, and TI’s optical touch input technology, which allows a physical control knob to be mounted directly on the screen — a feature that’s cool and useful. The center stack supports a variety of applications, including a 3D navigation system from Elektrobit that makes full use of the display:



At 17 inches, the display is big enough to show other functions, such as the car’s media player or virtual mechanic, and still have plenty of room for navigation:



The awesome (and very configurable) digital instrument cluster
The instrument cluster is implemented entirely in software, though you would hardly know it — the virtual gauges are impressively realistic. More impressive still is the cluster’s ability to morph itself on the fly. Put the car in Drive, and the cluster will display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulls these directions from the center stack’s navigation system. Put the car in Reverse, and the cluster will display a video feed from the car’s backup camera. You can also have the cluster display the current weather and the currently playing track:



The awesome (and just plain fun) web app
The web app works with any web browser and allows the driver to view data that the car publishes to the cloud, such as fluid levels, tire pressure, brake wear, and the current track being played by the infotainment system. It even allows the driver to remotely start or stop the engine, open or close windows, and so on:



The awesome (and nicely integrated) smartphone support
The Bentley also showcases how the QNX CAR Platform enables advanced integration with popular smartphones. For instance, the car can communicate with a smartphone to stream music, or to provide notifications of incoming email, news feeds, and other real-time information — all displayed in a manner appropriate to the automotive context. Here's an example:



The awesome everything else
I’ve only scratched the surface of what the car can do. For instance, it also provides:

  • Advanced voice rec — Just say “Hello Bentley,” and the car’s voice recognition system immediately comes to life and begins to interact with you — in a British accent, of course.
     
  • Advanced multimedia system — Includes support for Internet radio.
     
  • Video conferencing with realistic telepresence — Separate cameras for the driver and passenger provide independent video streams, while fullband voice technology from QNX offers expanded bandwidth for greater telepresence.
     
  • LTE connectivity — The car features an LTE radio modem, as well as a Wi-Fi hotspot for devices you bring into the car.

Moving pictures
Okay, time for some video. Here's a fun look at the making of the car:



And here's a run-through of the car's many capabilities, filmed by our friends at TI during 2013 CES:





What’s HTML5 got to do with automotive?

There’s been a lot of noise lately about HTML5. A September 2011 report by binvisions shows that search engines and social media web sites are leading the way toward adoption: Google, Facebook, YouTube, Wikipedia, Twitter, and plenty more have already transitioned to HTML5. Some are taking it even further: Facebook has an HTML5 Resource Center for developers and the Financial Times has a mobile HTML5 version of their website.

It won’t be long before HTML5 is ubiquitous. We think automakers should (and will) use it. 

To elucidate the technology and its relevance, we’ve created a series of educational videos on the topic. Here is the first in that series. Interviews with partners, customers, and industry gurus will soon follow. 



This simple overview is the first in a series from QNX on HTML5. (Personally I like the ending the best.)

Is this the most jazzed-up Jeep ever to hit CES?

The fourth installment in the CES Cars of Fame series. Our inductee for this week: a Jeep that gets personal.

Paul Leroux
It might not be as hip as the Prius or as fast as the Porsche. But it's fun, practical, and flexible. Better yet, you can drive it just about anywhere. Which makes it the perfect vehicle to demonstrate the latest features of the QNX CAR Platform for Infotainment.

It's called the QNX reference vehicle, and it's been to CES in Las Vegas, as well as to Detroit, New York City, and lots of places in between. It's our go-to vehicle for whenever we want to hit the road and showcase our latest infotainment technology. It even made a guest appearance at IBM's recent Information On Demand 2013 Big Data conference, where it demonstrated the power of connecting cars to the cloud.

The reference vehicle, which is based on a Jeep Wrangler, serves a different purpose than our technology concept cars. Those vehicles take the QNX CAR Platform as a starting point to demonstrate how the platform can help automakers hit new levels of innovation. The reference vehicle plays a more modest, but equally important, role: to show what the platform can do out of the box.

For instance, we updated the Jeep recently to show how version 2.1 of the QNX CAR Platform will allow developers to blend a variety of application and HMI technologies on the same display. In this case, the Jeep's head unit is running a mix of native, HTML5, and Android apps on an HMI built with the Qt application framework:



Getting personal
We also use the Jeep to demonstrate the platform's support for customization and personalization. For instance, here is the first demonstration instrument cluster we created specifically for the Jeep:



And here's a more recent version:



These clusters may look very different, but they share the same underlying features, such as the ability to display turn-by-turn directions, weather updates, and other information provided by the head unit.

Keeping with the theme of personalization, the Jeep also demonstrates how the QNX CAR Platform allows developers to create re-skinnable HMIs. Here, for example, is a radio app in one skin:



And here's the same app in a different skin:



This re-skinnability isn't just cool; it also demonstrates how the QNX CAR Platform can help automotive developers create a single underlying code base and re-use it across multiple vehicle lines. Good, that.

Getting complementary
The Jeep is also the perfect vehicle to showcase the ecosystem of complementary apps and services integrated with the QNX CAR Platform, such as the (very cool) street director navigation system from Elektrobit:



To return to the question, is this really the most jazzed-up Jeep to hit CES? Well, it will be making a return trip to CES in just a few weeks, with a whole new software build. So if you're in town, drop by and let us know what you think.

Seamless connectivity is for more than online junkies

As much as I’m not always enamored with sitting behind a computer all day, I find being off the grid annoying. Remember this email joke?

    You know you’re an online junkie when you:
    • wake up at 3:00 am to go to the bathroom and stop to check your email on the way back to bed
    • rarely communicate with your mother because she doesn’t have email
    • check your inbox. It says ‘no new messages,’ so you check it again 

Even though this joke circulated several years ago, it still strikes a chord. The big difference now is that there’s no longer a subculture of ‘online junkies.’ From the time we wake up in the morning to the time we go to bed, we all want to be connected — and that includes when we get behind the wheel. So to this joke I would add:

    • resent driving because it means going off the grid

At QNX, we’re working toward a seamless experience where people can enjoy the same connectivity whether they’re texting their spouse from the mall or checking traffic reports while driving down the highway. See what I mean:



For more information about the technology described in this video, visit the QNX website.
 

Can HTML5 keep car infotainment on track?

Paul Leroux
True story: When a train on the Trans-Mongolian Railway crosses from Mongolia into China, it must stop and have all of its wheel assemblies replaced. Why? Because the track gauge (distance between the rails) is 1520 mm in Mongolia and 1435 mm in China. Oops!

The rail industry realized long ago that, unless it settled on a standard, costly scenarios like this would repeat themselves ad infinitum. As a result, some 60% of railways worldwide, including those in China, now use standard gauge, ensuring greater interoperability and efficiency.

The in-car infotainment market should take note. It has yet to embrace a standard that would allow in-car systems to interoperate seamlessly with smartphones, tablets, and other mobile devices. Nor has it embraced a standard environment for creating in-car apps and user interfaces.

Of course, there are existing solutions for addressing these issues. But that's the problem: multiple solutions, and no accepted standard. And without one, how will cars and mobile devices ever leverage one another out of the box, without a lot of workarounds? And how will automakers ever tap into a (really) large developer community?

No standard means more market fragmentation — and more fragmentation means less efficiency, less interoperability, and less progress overall. Who wants that?

Is HTML5, which is already transforming app development in the desktop, server, and mobile worlds, the standard the car infotainment industry needs? That is one of the questions my colleague, Andy Gryc, will address in his webinar, “HTML5 for automotive infotainment: What, why, and how?” The webinar happens tomorrow, November 15. I invite you to check it out.
 

Tech-nimble

After working more than 20 years in high tech, I've settled on a mantra: This too shall pass. (Hey, I didn’t say it was original!) To that end, patience is critical, as is flexibility. And ultimately, success depends less on predicting technology trends and more on responding to them. You've got to be tech-nimble, which requires not only the willingness to change, but the technology to accommodate — and profit from — that change.

Yesterday, Adobe announced a restructuring based on a change in direction, from mobile Flash to HTML5. Some might consider this development proof that Adobe lost the battle to Steve Jobs. But to my mind, they've simply recognized a trend and responded decisively. Adobe has built a product portfolio based heavily on tooling, including tools for HTML5 development. So they definitely fall into the tech-nimble category.

QNX has an even greater responsibility to remain tech-nimble because so many OEMs use our technology as a platform for their products. Our technology decisions have an impact that ripples throughout companies building in-car infotainment units, patient monitoring systems, industrial terminals, and a host of other devices.

So back to the Flash versus HTML5 debate. QNX is in a great position because our universal application platform approach enables us to support new technologies quickly, with minimal integration effort. This flexibility derives in part from our underlying architecture, which allows OS services to be cleanly separated from the applications that access them.

Today, our platform supports apps based on technologies such as Flash, HTML5, Qt, native C/C++, and OpenGL ES. More to the point, it allows our customers to seamlessly blend apps from multiple environments into a single, unified user experience.

Now that’s tech-nimble.
 

Adobe’s out of mobile? Read the fine print

The blogosphere is abuzz with Adobe’s apparent decision to abandon Flash on mobile devices. I get the impression, though, that many people haven’t bothered to read Adobe’s announcement. If they did, they would come away with a very different conclusion.

Let me quote what Adobe actually said (emphasis mine):

    "Our future work with Flash on mobile devices will be focused on enabling Flash developers to package native apps with Adobe AIR for all the major app stores. We will no longer adapt Flash Player for mobile devices to new browser, OS version or device configurations. Some of our source code licensees may opt to continue working on and releasing their own implementations. We will continue to support the current Android and PlayBook configurations with critical bug fixes and security updates."

What’s being discontinued is the Flash plug-in for mobile browsers. Adobe will still support and work on Mobile AIR, and on the development of standalone mobile applications.

A number of cross-platform applications today are implemented in Adobe AIR, and that’s staying the same. Adobe is being smart — they’re picking and choosing their battles, and have decided to give this one to HTML5. We’re big believers in HTML5, and Adobe's announcement makes complete sense: Don’t bother with the burden of Flash plug-in support when you can do it all in the browser. You can still build killer apps using Adobe AIR.
 

What's the word on HTML5?

Ten videos on HTML5 in the car. Actually, there are only nine — but I'm getting ahead of myself.

Paul Leroux
Has it been two years already? In November 2011, a group of my QNX colleagues, including Andy Gryc, launched a video series on using HTML5 in the car. They realized that HTML5 holds enormous potential for automotive infotainment, from reducing industry fragmentation to helping head units keep pace with the blistering rate of change in the mobile industry. They also realized it was important to get the word out — to help people understand that the power of HTML5 extends far beyond the ability to create web pages. And so, they invited a variety of thought leaders and industry experts with HTML5 experience to stand in front of the camera and share their stories.

All of which is to say, if you're interested in the future of HTML5 in the car, and in what thought leaders from companies such as OnStar, Audi, Gartner, Pandora, TCS, and QNX have to say about it, you've come to the right place. So let's get started, shall we?


Interview with Steve Schwinke of OnStar
Andy Gryc catches up with Steve Schwinke, director of advanced technology for OnStar, who is bullish on both the short- and long-term benefits of HTML5:




Interview with Mathias Haliger of Audi
Derek Kuhn of QNX sits down with Mathias Haliger, head of MMI system architecture at Audi AG, who discusses the importance of HTML5 to his company and to the industry at large:




The analyst perspective: Thilo Koslowski of Gartner
Andy gets together with Thilo Koslowski, VP Distinguished Analyst at Gartner, to discuss the notion of controlled openness for the car — and how HTML5 fits into the picture:




Interview with Tom Conrad of Pandora
Andy meets up with Tom Conrad, CTO at Pandora, to get his take on the benefits of standardizing on HTML5 across markets:




Interview with Michael Camp of TCS
Andy Gryc sits down with Michael Camp, director of engineering for in-car telematics at TeleCommunication Systems (TCS), to get a software supplier's perspective on HTML5:




Interview with Matthew Staikos
Andy talks with Matthew Staikos, former web-technology manager at BlackBerry, about the impact of HTML5 on hardware options, memory usage, and app stores:




The myth buster interview
Andy and Kerry Johnson get together to discuss how HTML5 apps can deliver snappy performance, run without a Web browser, and even work without an Internet connection:




Interview with Sheridan Ethier
Andy drops in on Sheridan Ethier, manager of the QNX CAR Platform development team, to get a developer's perspective on HTML5:




Kickoff video
And last but not least, here is the video that started it all. Andy Gryc gives his take on why he believes HTML5 is destined to become the foundation for next-gen automotive apps:




Blooper video
Did I say last but not least? Sorry, I have one more video that you just have to see:




Top 8 questions for squeezing high-end tech into low-end infotainment

A couple of weeks ago, I hosted a webinar that addressed the question, “Is it possible to build an infotainment system that meets today's customer demands with yesterday's price tag?” The webinar explored several ways to reduce RAM and ROM requirements, eliminate hardware, and share hardware, all with the goal of cutting BOM costs.

As always, the audience asked lots of great questions, several of which I have answered here. Of course, these provide only a hint of the ground covered in the webinar, so I invite you to download the archived version to get the full details.

Built-in phone module versus brought-in smart phone: what is your take on this, and is a hybrid approach feasible?
The approach will vary from automaker to automaker. I think that embedded phones will be required for certain cars, especially if they use systems that rely on cloud-based services. This approach adds to the BOM cost, of course, but it may reduce overall cost, depending on what features can be off-loaded to the cloud.

Some brands will encourage brought-in devices as the lowest-cost alternative. The consumer will then have to deal with the setup and maintenance issues required to pair or charge the phone. I don’t see a clear-cut analysis that says one method will be better than the other — it really depends on what you want to accomplish.

Any thoughts on using MirrorLink to clone a virtual display to a remote physical LCD?
If you’re talking about a remote (as in cloud-based) device, I would say that HTML5 is a much more natural choice for a server-based application. If it’s a brought-in device, then MirrorLink or HTML5 could be appropriate.

If GPS is moved to a brought-in phone, how will a stolen vehicle be located?
It won’t be, unless the phone was left in the vehicle. This is one of the trade-offs you make when trying to reduce cost.

Of the cost-saving techniques you discussed, which are most likely to be used?
Already, some system designers are removing wake-up micros and DSPs. I’m not aware of any systems where the LCD has been removed, but this approach would probably offer the largest cost savings, making it a likely choice for entry-level systems and cost-sensitive markets.

Security and reliability are the main concerns of a head unit. Squeezing high-end technologies into low-end systems won’t relax those expectations. For instance, smart phone integration will be an add-on instead of replacing functionality of the head unit. Thoughts?
The trade-offs will need to be communicated to the customer. You can never build a head unit augmented with a smartphone that works as reliably as the head unit operating by itself. As an OEM or Tier 1, you just don’t have enough control over the brought-in devices.

As an industry, we need to educate consumers. If they start relying on the phone in the car to provide certain features, then they will have to expect an inevitable degradation in overall system quality. It comes back to that famous adage: “cost, quality, or time — pick two”.

MirrorLink has a defined communication interface to the head unit. You also mentioned HTML5 as an option. Is there a defined standard yet for transmitting the HTML5 up to the head unit?
The interface between the web server (i.e., the phone) and the web client (i.e., the head unit) is already well established and tested. For some specialized features (for instance, allowing HTML5 code to access vehicle services), some standardization may be required. This will hopefully be a topic of discussion in November, at the automotive HTML5 workshop hosted by the W3C in Rome. Some Car Connectivity Consortium members have also discussed the possibility that, in the future, MirrorLink could add a transport mechanism based on HTML5.

You discussed peripheral sharing, using QNX transparent distributed processing. Does QNX TDP require secure authentication between distributed boxes?
No, it does not. It relies on standard POSIX user and group permissions to control access rights to devices.
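
To make this concrete, here is a minimal sketch of how peripheral sharing looks to application code under QNX Transparent Distributed Processing (Qnet). The node and device names are hypothetical; the point is that a remote device simply shows up in the local pathname space under /net/<nodename>, and ordinary POSIX permissions on that path decide who may open it.

    /* Hypothetical example: open a tuner device that physically lives on
       another box named "headunit". With TDP/Qnet, the remote pathname
       space appears under /net, so the usual open() call works unchanged
       and standard POSIX user and group permissions control access. */
    #include <fcntl.h>

    int open_remote_tuner(void)
    {
        return open("/net/headunit/dev/radio0", O_RDWR);
    }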

Can you discuss any trends you see regarding Ethernet or TCP/IP in the vehicle?
Ethernet is definitely becoming more interesting in the vehicle, due to the introduction of Ethernet AVB. It makes a very natural replacement for audio-video transmission over MOST, and the additions to AVB that fulfil strict timing requirements can replace CAN or MOST for non-media vehicle messages. Ethernet also has obvious advantages when you need to access Wi-Fi networks, cloud services, and mobile devices.

My top moments of 2013 — so far

Paul Leroux
Yes, I know, 2013 isn’t over yet. But it’s been such a milestone year for our automotive business that I can’t wait another two months to talk about it. And besides, you’ll be busy as an elf at the end of December, visiting family and friends, skiing the Rockies, or buying exercise equipment to compensate for all those holiday carbs. Which means if I wait, you’ll never get to read this. So let’s get started.


We unveil a totally new (and totally cool) technology concept car
Times Square. We were there.
It all began at 2013 CES, when we took the wraps off the latest QNX technology concept car — a one-of-a-kind Bentley Continental GT. The QNX concept team outfitted the Bentley with an array of technologies, including a high-definition DLP display, a 3D rear-view camera, cloud-based voice recognition, smartphone connectivity, and… oh heck, just read the blog post to get the full skinny.

Even if you weren’t at CES, you could still see the car in action. Brian Cooley of CNET, Michael Guillory of Texas Instruments, the folks at Elektrobit, and Discovery Canada’s Daily Planet were just some of the individuals and organizations who posted videos. You could also connect to the car through a nifty web app. Heck, you could even see the Bentley’s dash on the big screen in Times Square, thanks to the promotional efforts of Elektrobit, who also created the 3D navigation software for the concept car.

We ship the platform
We wanted to drive into CES with all cylinders firing, so we also released version 2.0 of the QNX CAR Platform for Infotainment. In fact, several customers in the U.S., Germany, Japan, and China had already started to use the platform, through participation in an early access program. Which brings me to the next milestone...

Delphi boards the platform
The first of many.
Also at CES, Delphi, a global automotive supplier and long-time QNX customer, announced that version 2.0 of the QNX CAR Platform will form the basis of its next-generation infotainment systems. As it turned out, this was just one of several QNX CAR customer announcements in 2013 — but I’m getting ahead of myself.

We have the good fortune to be featured in Fortune
Fast forward to April, when Fortune magazine took a look at how QNX Software Systems evolved from its roots in the early 1980s to become a major automotive player. Bad news: you need a subscription to read the article on the Fortune website. Good news: you can read the same article for free on CNN Money. ;-)

A music platform sets the tone for our platform
In April, 7digital, a digital music provider, announced that it will integrate its 23+ million track catalogue with the QNX CAR Platform. It didn't take long for several other partners to announce their platform support. These include Renesas (R-Car system-on-chip for high-performance infotainment), AutoNavi (mobile navigation technology for the Chinese market), Kotei (navigation engine for the Japanese market), and Digia (Qt application framework).

We stay focused on distraction
Back in early 2011, Scott Pennock of QNX was selected to chair an ITU-T focus group on driver distraction. The group’s objective was serious and its work was complex, but its ultimate goal was simple: to help reduce collisions. This year, the group wrapped up its work and published several reports — but really, this is only the beginning of QNX and ITU-T efforts in this area.

We help develop a new standard
Goodbye fragmentation; hello standard APIs.
Industry fragmentation sucks. It means everyone is busy reinventing the wheel when they could be inventing something new instead. So I was delighted to see my colleague Andy Gryc become co-chair of the W3C Automotive and Web Platform Business Group, which has the mandate to accelerate the adoption of web technologies in the car. Currently, the group is working to draft a standard set of JavaScript APIs for accessing vehicle data. Fragmentation, thy days are numbered.

We launch an auto safety program
A two-handed approach to helping ADAS developers.
On the one hand, we have a 30-year history in safety-critical systems and proven competency in safety certifications. On the other hand, we have deep experience in automotive software design. So why not join both hands together and allow auto companies to leverage our full expertise when they are building digital instrument clusters, advanced driver assistance systems (ADAS), and other in-car systems with safety requirements?

That’s the question we asked ourselves, and the answer was the new QNX Automotive Safety Program for ISO 26262. The program quickly drew support from several industry players, including Elektrobit, Freescale, NVIDIA, and Texas Instruments.

We jive up the Jeep
A tasty mix of HTML5 & Android apps, served on a Qt interface, with OpenGL ES on the side.
If you don’t already know, we use a Jeep Wrangler as our reference vehicle — basically, a demo vehicle outfitted with a stock version of the QNX CAR Platform. This summer, we got to trick out the Jeep with a new, upcoming version of the platform, which adds support for Android apps and for user interfaces based on the Qt 5 framework.

Did I mention? The platform runs Android apps in a separate application container, much like it handles HTML5 apps. This sandboxed approach keeps the app environment cleanly partitioned from the UI, protecting both the UI and the overall system from unpredictable web content. Good, that.

The commonwealth’s leader honors our leader
I only ate one piece. Honest.
Okay, this one has nothing to do with automotive, but I couldn’t resist. Dan Dodge, our CEO and co-founder, received a Queen Elizabeth II Diamond Jubilee Medal in recognition of his many achievements and contributions to Canadian society. To celebrate, we gave Dan a surprise party, complete with the obligatory cake. (In case you’re wondering, the cake was yummy. But any rumors suggesting that I went back for a second, third, and fourth piece are total fabrications. Honestly, the stories people cook up.)

Mind you, Dan wasn’t the only one to garner praise. Sheridan Ethier, the manager of the QNX CAR development team, was also honored — not by the queen, but by the Ottawa Business Journal for his technical achievements, business leadership, and community involvement.

Chevy MyLink drives home with first prize — twice
There's nothing better than going home with first prize. Except, perhaps, doing it twice. In January, the QNX-based Chevy MyLink system earned a Best of CES 2013 Award, in the car tech category. And in May, it pulled off another coup: first place in the "Automotive, LBS, Navigation & Safe Driving" category of the 2013 CTIA Emerging Technology (E-Tech) Awards.

Panasonic, Garmin, and Foryou get with the platform
Garmin K2 platform: because one great platform deserves another.
August was crazy busy — and crazy good. Within the space of two weeks, three big names in the global auto industry revealed that they’re using the QNX CAR Platform for their next-gen systems. Up first was Panasonic, who will use the platform to build systems for automakers in North America, Europe, and Japan. Next was Foryou, who will create infotainment systems for automakers in China. And last was Garmin, who are using the platform in the new Garmin K2, the company’s infotainment solution for automotive OEMs.

And if all that wasn’t cool enough…

Mercedes-Benz showcases the platform
Did I mention I want one?
When Mercedes-Benz decides to wow the crowds at the Frankfurt Motor Show, it doesn’t settle for second best. Which is why, in my not so humble opinion, they chose the QNX CAR Platform for the oh-so-desirable Mercedes-Benz Concept S-Class Coupé.

Mind you, this isn’t the first time QNX and Mercedes-Benz have joined forces. In fact, the QNX auto team and Mercedes-Benz Research & Development North America have collaborated since the early 2000s. Moreover, QNX has supplied the OS for a variety of Mercedes infotainment systems. The infotainment system and digital cluster in the Concept S-Class Coupé are the latest — and arguably coolest — products of this long collaboration.

We create noise to eliminate noise
Taking a sound approach to creating a quieter ride.
Confused yet? Don’t be. You see, it’s quite simple. Automakers today are using techniques like variable cylinder management, which cut fuel consumption (good), but also increase engine noise (bad). Until now, car companies have been using active noise control systems, which play “anti-noise” to cancel out the unwanted engine sounds. All fine and good, but these systems require dedicated hardware — and that makes them expensive. So we devised a software product, QNX Acoustics for Active Noise Control, that not only out-performs conventional solutions, but can run on the car’s existing audio or infotainment hardware. Goodbye dedicated hardware, hello cost savings.

And we flub our lines on occasion
Our HTML5 video series has given companies like Audi, OnStar, Gartner, TCS, and Pandora a public forum to discuss why HTML5 and other open standards are key to the future of the connected car. The videos are filled with erudite conversation, but every now and then, it becomes obvious that sounding smart in front of a camera is a little harder than it looks. So what did we do with the embarrassing bits? Create a blooper reel, of course.

Are these bloopers our greatest moments? Nope. Are they among the funniest? Oh yeah. :-)

A question of architecture

The second of a series on the QNX CAR Platform. In this installment, we start at the beginning — the platform’s underlying architecture.

In my previous post, I discussed how infotainment systems must perform multiple complex tasks, often all at once. At any time, a system may need to manage audio, show backup video, run 3D navigation, synch with Bluetooth devices, display smartphone content, run apps, present vehicle data, process voice signals, perform active noise control… the list goes on.

The job of integrating all these functions is no trivial task — an understatement if ever there was one. But as with any large project, starting with the right architecture, the right tools, and the right building blocks can make all the difference. With that in mind, let’s start at the beginning: the underlying architecture of the QNX CAR Platform for Infotainment.

The architecture consists of three layers: human machine interface (HMI), middleware, and platform.



The HMI layer
The HMI layer is like a bonus pack: it supports two reference HMIs out of the box, both of which have the same appearance and functionality. So what’s the difference? One is based on HTML5, the other on Qt 5. This choice demonstrates the underlying flexibility of the platform, which allows developers to create an HMI with any of several technologies, including HTML5, Qt, or a third-party toolkit such as Elektrobit GUIDE or Crank Storyboard.

A choice of HMIs
Mind you, the choice goes further than that. When you build a sophisticated infotainment system, it soon becomes obvious that no single tool or technology can do the job. The home screen, which may contain controls for Internet radio, hands-free calls, HVAC, and other functions, might need an environment like Qt. The navigation app, for its part, will probably use OpenGL ES. Meanwhile, some applications might be based on Android or HTML5. Together, all these heterogeneous components make up the HMI.

The QNX CAR Platform embraces this heterogeneity, allowing developers to use the best tools and application environments for the job at hand. More to the point, it allows developers to blend multiple app technologies into a single, unified user interface, where they can all share the same display, at the same time.

To perform this blending, the platform employs several mechanisms, including a component called the graphical composition manager. This manager acts as a kind of universal framework, providing all applications, regardless of how they’re built, with a highly optimized path to the display.

For example, look at the following HMI:



Now look at the HMI from another angle to see how it comprises several components blended together by the composition manager:



To the left, you see video input from a connected media player or smartphone. To the right, you see a navigation application based on OpenGL ES map-rendering software, with an overlay of route metadata implemented in Qt. And below, you see an HTML page that provides the underlying wallpaper; this page could also display a system status bar and UI menu bar across all screens.

For each component rendered to the display, the graphical composition manager allocates a separate window and frame buffer. It also allows the developer to control the properties of each individual window, including location, transparency, rotation, alpha, brightness, and z-order. As a result, it becomes relatively straightforward to tile, overlap, or blend a variety of applications on the same screen, in whichever way creates the best user experience.
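
The post stops short of code, but on the QNX platform this per-window control is typically exercised through the Screen graphics API. The following is a minimal sketch, not the platform's actual HMI code: it assumes a Screen context and client window already exist, and the property values are purely illustrative.

    #include <screen/screen.h>

    /* Sketch: blend one client window (say, the navigation app) with the
       rest of the HMI by adjusting its position, stacking order, and
       transparency. All values are illustrative. */
    void blend_window(screen_context_t ctx, screen_window_t win)
    {
        int pos[2] = { 800, 0 };  /* pixels from the display's top-left    */
        int zorder = 5;           /* stack above the HTML5 wallpaper page  */
        int alpha  = 200;         /* 0 = fully transparent, 255 = opaque   */

        screen_set_window_property_iv(win, SCREEN_PROPERTY_POSITION, pos);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &zorder);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_GLOBAL_ALPHA, &alpha);

        screen_flush_context(ctx, 0);  /* make the queued changes take effect */
    }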

The middleware layer
The middleware layer provides applications with a rich assortment of services, including Bluetooth, multimedia discovery and playback, navigation, radio, and automatic speech recognition (ASR). The ASR component, for example, can be used to turn on the radio, initiate a Bluetooth phone call from a connected smartphone, or select a song by artist or song title.

I’ll drill down into several of these services in upcoming posts. For now, I’d like to focus on a fundamental service that greatly simplifies how all other services and applications in the system interact with one another. It’s called persistent publish/subscribe messaging, or PPS, and it provides the abstraction needed to cleanly separate high-level applications from low-level business logic and services.

PPS messaging provides an abstraction layer between system services and high-level applications

Let’s rewind a minute. To implement communications between software components, C/C++ developers must typically define direct, point-to-point connections that tend to “break” when new features or requirements are introduced. For instance, an application communicates with a navigation engine, but all connections enabling that communication must be redefined when the system is updated with a different engine.

This fragility might be acceptable in a relatively simple system, but it creates a real bottleneck when you are developing something as complex, dynamic, and quickly evolving as the design for a modern infotainment system. PPS addresses the problem by allowing developers to create loose, flexible connections between components. As a result, it becomes much easier to add, remove, or replace components without having to modify other components.

So what, exactly, is PPS? Here’s a textbook answer: an asynchronous object-based system that consists of publishers and subscribers, where publishers modify the properties of data objects and the subscribers to those objects receive updates when the objects have been modified.

So what does that mean? Well, in a car, PPS data objects allow applications to access services such as the multimedia engine, voice recognition engine, vehicle buses, connected smartphones, hands-free calling, and contact databases. These data objects can each contain multiple attributes, each attribute providing access to a specific feature — such as the RPM of the engine, the level of brake fluid, or the frequency of the current radio station. System services publish these objects and modify their attributes; other programs can then subscribe to the objects and receive updates whenever the attributes change.
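
To make that concrete, here is what a hypothetical PPS object for an HVAC service might look like. Each object is exposed as a small text file under /pps: the first line names the object, and each subsequent line is an attribute in name:encoding:value form (n for numbers, b for booleans, an empty encoding for plain strings).

    @hvac
    fanSpeed:n:3
    temperature:n:21
    airConditioning:b:true
    zone::driver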

The PPS service is programming-language independent, allowing programs written in a variety of programming languages (C, C++, HTML5, Java, JavaScript, etc.) to intercommunicate, without any special knowledge of one another. Thus, an app in a high-level environment like HTML5 can easily access services provided by a device driver or other low-level service written in C or C++.
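
Because PPS objects live in the filesystem, subscribing from C is just ordinary file I/O. Here is a minimal sketch; the object path is hypothetical, and the "?wait,delta" open options ask PPS to block each read() until the object changes and to return only the attributes that changed.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Open the object in subscribe mode. */
        int fd = open("/pps/qnxcar/hvac?wait,delta", O_RDONLY);
        if (fd == -1) {
            perror("open");
            return 1;
        }

        char buf[1024];
        ssize_t len;
        while ((len = read(fd, buf, sizeof(buf) - 1)) > 0) {
            buf[len] = '\0';
            printf("%s", buf);   /* changed attributes, one per line */
        }
        return 0;
    }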

I’m only touching on the capabilities of PPS. To learn more, check out the QNX documentation on this service.

The platform layer
The platform layer includes the QNX OS and the board support packages, or BSPs, that allow the OS to run on various hardware platforms.

An inherently modular and extensible architecture
A BSP may not sound like the sexiest thing in the world — it is, admittedly, a deeply technical piece of software — but without it, nothing else works. And, in fact, one reason QNX Software Systems has such a strong presence in automotive is that it provides BSPs for all the popular infotainment platforms from companies like Freescale, NVIDIA, Qualcomm, and Texas Instruments.

As for the QNX Neutrino OS, you could write a book about it — which is another way of saying it’s far beyond the scope of this post. Suffice it to say that its modularity, extensibility, reliability, and performance set the tone for the entire QNX CAR Platform. To get a feel for what the QNX OS brings to the platform (and by extension, to the automotive industry), I invite you to visit the QNX Neutrino OS page on the QNX website.

Marking over 5 years of putting HTML in production cars

Think back to when you realized the Internet was reaching beyond the desktop. Or better yet, when you realized it would touch every facet of your life. If you haven’t had that second revelation yet, perhaps you should read my post about the Twittering toilet.

For me, the realization occurred 11 years ago, when I signed up with QNX Software Systems. QNX was already connecting devices to the web, using technology that was light years ahead of anything else on the market. For instance, in the late 90s, QNX engineers created the “QNX 1.44M Floppy,” a self-booting promotional diskette that showcased how the QNX OS could deliver a complete web experience in a tiny footprint. It was an enormous hit, with more than 1 million downloads.

Embedding the web, dot com style: the QNX-powered Audrey
Also ahead of its time was the concept of a tablet computer that provided full web access. When I started at QNX, I was responsible for tablets, thin clients, and set-top boxes. The most successful of these pioneering devices was the 3COM Audrey kitchen tablet. It could send and receive email, browse the web, and sync to portable devices — incredibly sophisticated for the year 2000.

At the time, Don Fotsch, one of Audrey’s creators, coined the term “Internet Snacking” to describe the device’s browsing environment. The dot com crash in 2001 cut Audrey’s life short, but QNX maintained its focus on enabling a rich Internet experience in embedded devices, particularly those within the car.

The point of these stories is simple: Embedding the web is part of the QNX DNA. At one point, we even had multiple browser engines in production vehicles, including the Access Netfront engine, the QNX Voyager engine, and the OpenWave WAP Browser. In fact, we have had cars on the road with Web technologies since model year 2006.

With that pedigree in enabling HTML in automotive, we continue to push the envelope. We already enable unlimited web access with full browsers in BMW and other vehicles, but HTML in automotive is changing from a pure browsing experience to a full user experience encompassing applications and HMIs. With HTML5, this experience extends even to speech recognition, AV entertainment, rich animations, and full application environments — Angry Birds anyone?

People now often talk about “App Snacking,” but in the next phase of HTML5 in the car, it will be “What’s for dinner?”!

 

BBDevCon — Apps on BlackBerry couldn't be better

Unfortunately I joined the BBDevCon live broadcast a little too late to capture some of the absolutely amazing TAT Cascades video. RIM announced that TAT will be fully supported as a new HMI framework on BBX (yes, the new name of QNX OS for PlayBook and phones has been officially announced now). The video was mesmerizing — a picture album with slightly folded pictures falling down in an array, shaded and lit, with tags flying in from the side. It looked absolutely amazing, and it was created with simple code that configured the TAT framework "list" class with some standard properties. And there was another very cool TAT demo that showed an email filter with an active touch mesh, letting you filter your email in a very visual way. Super cool looking.

HTML5 support is huge, too — RIM has had WebWorks and Torch for a while, but their importance continues to grow. HTML5 apps provide the way to unify older BB devices and any of the new BBX-based PlayBooks and phones. That's a beautiful tie-in to automotive, where we're building our next generation QNX CAR software using HTML5. The same apps running on desktops, phones, tablets, and cars? And on every mobile device, not just one flavor like iOS or Android? Sounds like the winning technology to me.

Finally, they talked about the success of App World. There were some really nice facts to contrast with the negative press RIM has received on "apps". First, some interesting comparisons: 1% of Apple developers made more than $1,000, but 13% of BlackBerry developers made more than $100,000. Whoa. And App World generates the second-most revenue of any app store — more than Android. Also very interesting!

I can't do better than the presenters, so I'll finish up with some pics for the rest of the stats...