We need a rock solid M2M service layer

February 18, 2012

The $10 million M2M question is how to support thousands of business processes in tens of thousands of businesses in an efficient and scalable way. Beecham Research’s M2M Sector Map (see Interesting reading) illustrates the fragmented and complex market very well. Mobile operators typically have a few services with very many users, while most potential users of M2M have industry-specific or even company-specific needs in relatively small numbers. This is why most connected devices in cellular networks today are high-volume terminals (typically electricity meters or eReaders) with small ARPU but also little work required of the operator per terminal deployed. The issue is that the electricity meters are rolled out primarily due to political decisions, and one can argue that eReaders, iPads, etc. are just big mobile phones rather than real M2M solutions.

The other type of deployment today is primarily the one where the value gained is big enough to pay for integration, software development, customization, and so on. These are “real” M2M solutions that leverage the value of connecting things and putting computing on top.

I come to think of the “bankruptcy gap” between the only two business models viable over time: low price/low cost/high volume, and high price/great perceived value/customization. In the bankruptcy gap you find average products at average prices. There is an obvious risk today in addressing the bulk of the very fragmented M2M market: the considerable effort required to provide what the customer wants, combined with price sensitivity because the benefits are not big or clear enough, is a scary combination.

This looks similar to the bankruptcy gap but with one big difference: the driver putting businesses across industries into the bankruptcy gap is commoditization of products and services. But in the case of M2M we are in the early days! How can this be?

I think the situation is dangerous since it threatens to once again leave us with a great idea, a lot of energy and efforts, poor results and many investments and opportunities wasted. To me the key reasons why we face an artificial bankruptcy gap in M2M now are:

– Parts of the solution, like plain-vanilla 2G data subscriptions, are more or less commodities today. Other connectivity options for M2M, like PLC, satellite, Wi-Fi, RFID, NFC and PSTN, are not commoditized, and combinations of them are complex to deal with.

– Today it takes too much effort to develop, integrate and support M2M applications. Robust, efficient, large-scale service delivery platforms are needed that support complete, standardized development stacks, different networks and numerous APIs.

The good news is that there is progress in these areas. Most mobile operators have deployed, or will deploy, Service Enablement Services (SES) taking care of horizontal requirements on top of the connectivity. Module and equipment vendors, independent start-ups and others are working on similar, often cloud-based, offerings, and some of them support combinations of different connectivity technologies. Many standards development organizations have recognized the need for a common, cost-efficient M2M service layer that can be embedded in different hardware and software to provide robust connectivity between terminals and application servers. The ITU Focus Group on the Machine-to-Machine Service Layer, announced January 16 and initially focused on e-health, is a good example.
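To make the idea of a common service layer concrete, here is a minimal sketch of what it could offer applications: one uniform send interface, with the underlying connectivity (cellular, satellite, PLC, and so on) hidden behind pluggable transports. All class and method names here are invented for illustration; no real standard or product API is implied.

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Hypothetical connectivity plug-in; one subclass per network type."""
    @abstractmethod
    def deliver(self, payload: bytes) -> str: ...

class CellularTransport(Transport):
    def deliver(self, payload: bytes) -> str:
        return f"cellular:{len(payload)} bytes"

class SatelliteTransport(Transport):
    def deliver(self, payload: bytes) -> str:
        return f"satellite:{len(payload)} bytes"

class ServiceLayer:
    """Applications call send(); they never see the network details."""
    def __init__(self, transport: Transport):
        self.transport = transport

    def send(self, payload: bytes) -> str:
        return self.transport.deliver(payload)

# The same application code works over either network:
print(ServiceLayer(CellularTransport()).send(b"temp=21.5"))
print(ServiceLayer(SatelliteTransport()).send(b"temp=21.5"))
```

The point of the sketch is the seam: swapping the transport changes nothing for the application, which is exactly the decoupling a standardized service layer is meant to provide.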

The best way to avoid another M2M flop is to ensure strong collaboration in establishing a rock-solid common M2M service layer with standardized protocols and APIs, and to always start from real customer problems, avoiding brilliant answers to questions nobody is asking.


M2M and PSI – Public Sector Information

February 14, 2012

The connecting part of M2M is not really the interesting one; it is the enabler. The computing part is the interesting one, and where most value is created. Connecting things becomes easier and easier technically, practically and financially, while computing power in the cloud is developing immensely fast. Making information collected from machines and other relevant sources available to Internet application developers, using the computing power available in the cloud, will push innovation to new heights, provided security and integrity are taken care of properly. By utilizing computing power in the cloud, devices can be lighter, faster, cheaper and optimized for other things like interaction and usability, in something quite similar to the good old client-server architecture.
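The client-server split described above can be sketched in a few lines: the device only packages and ships a reading, and all aggregation and analysis happen on the cloud side. This is an illustrative toy, not any real M2M platform; the names (report_reading, CloudAggregator) are invented.

```python
import json
import statistics

def report_reading(device_id: str, value: float) -> str:
    """Device side: serialize a minimal payload; no local analytics."""
    return json.dumps({"device": device_id, "value": value})

class CloudAggregator:
    """Cloud side: all the computing happens here, not on the device."""
    def __init__(self):
        self.readings = {}

    def ingest(self, payload: str) -> None:
        msg = json.loads(payload)
        self.readings.setdefault(msg["device"], []).append(msg["value"])

    def average(self, device_id: str) -> float:
        return statistics.mean(self.readings[device_id])

cloud = CloudAggregator()
for v in (21.0, 22.0, 23.0):
    cloud.ingest(report_reading("meter-001", v))
print(cloud.average("meter-001"))  # 22.0
```

Keeping the device side this thin is what makes the hardware lighter and cheaper: the expensive, fast-moving part (the analytics) lives in the cloud and can be improved without touching deployed devices.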

Governments around the world try to support their local high-tech industry, and most cities today have their own incubators, investment funds and support programs. A quick and quite affordable way for governments at all levels to push and support innovation is to give developers access to data produced in the public sector, and to promote, maybe even push, usage of modern, innovative information technology. That access must be affordable and not too complicated for the developers.
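“Not too complicated” can mean something as simple as publishing datasets in plain machine-readable formats. As a hypothetical example, here is a developer consuming an invented public air-quality feed with nothing more than the standard library; the dataset, station IDs and threshold are all made up for illustration.

```python
import json

# Invented sample of a public-sector open data feed (JSON).
feed = json.loads("""
{
  "dataset": "air-quality",
  "stations": [
    {"id": "STH-01", "pm10": 18.2},
    {"id": "STH-02", "pm10": 25.7},
    {"id": "STH-03", "pm10": 31.4}
  ]
}
""")

# A trivial "app": flag stations above an assumed PM10 threshold.
THRESHOLD = 25.0
alerts = [s["id"] for s in feed["stations"] if s["pm10"] > THRESHOLD]
print(alerts)  # ['STH-02', 'STH-03']
```

When the barrier to a first working prototype is this low, far more developers will experiment with public data, which is precisely the innovation effect the PSI agenda is after.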

The EU issued the PSI Directive – the Directive on the re-use of public sector information – already in 2003, built on two key pillars of the internal market: transparency and fair competition. The directive defined minimum rules for re-use of PSI and recommended that member states go beyond these rules and adopt open data policies. Several countries, including Sweden, have been chased by the Commission for slow or poor implementation of the PSI Directive, which in the case of Sweden is strange since we have had legislation on freedom of information, including the right to reprint official documents, since 1766. The most recent version of the directive, from 2012, also brings museums, archives and libraries into scope.

Most data produced within governments remains there, and governments’ ability to attract developers to build innovative and useful applications and services for citizens and society is limited. The growing number of M2M solutions deployed will drastically increase the amount of useful data created, which is why countries acting now will gain a growing advantage over others.


M2M and SIM cards

February 8, 2012

With the GSM mobile phones came the SIM card (Subscriber Identity Module) in 1991, turning a subscription into a tangible thing that could be removed, put into another phone, and stolen. Users could bring their GSM identity to another phone without involving the operator, which is very convenient. By decoupling the payment relationship from the subscription and adding possibilities for pre-payment, a brand new and very popular type of mobile service was invented. Adding some memory available to the user made it possible to bring data, typically phone numbers, along with the SIM card to another phone.

In the early days of mobile M2M, operators sent individual SIM cards in envelopes, which added an administrative issue to the already complicated task of deploying M2M solutions. Today we have a range of solutions for dealing with SIM cards in M2M deployments. Making SIMs smaller is important in the handset market, and in some specific cases we can leverage this development in the M2M market too, but most often the size doesn’t matter. With the iPhone 4 came the micro-SIM, and next in line is the nano-SIM, measuring approximately 12 by 9 millimeters, 30% smaller than the micro-SIM; its thickness is reduced by about 15%. The standardization of the nano-SIM is expected to be completed through ETSI by the end of the year, and the first nano-SIM phones will probably hit the market in 2013. This will help phone vendors create thinner devices and free up room for additional memory and larger batteries, but unless we are dealing with really small devices it will probably not be important for the M2M market.

A much more interesting development for the M2M market is the over-the-air (OTA) SIM update, accepted by the GSMA earlier this year. It will enable device manufacturers to sell devices with SIM cards included from the factory and to provision the subscription afterwards in a secure fashion. Apple, Google and others have been pushing in this direction for obvious reasons, while some mobile operators were quite negative to the idea. Now it seems the M2M players will make this happen first. Industry expert Northstream has predicted that SIM cards may disappear into the cloud, maybe already this year. In the M2M market this would make life much easier for vendors of things with embedded M2M connectivity: the connectivity could be built in at manufacturing, associated with a specific operator at the local reseller, and expensive field maintenance could be avoided as well. The OTA SIM will bring a lot of transparency to the mobile industry, removing practical and financial barriers and making life easier for everyone involved.
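The lifecycle described above, shipping from the factory with only a bootstrap identity and binding to an operator later, can be sketched as a simple state machine. This is purely illustrative; the states, names and identifier format are invented and do not reflect the actual GSMA specification.

```python
class EmbeddedSim:
    """Toy model of a factory-installed SIM awaiting remote provisioning."""
    def __init__(self, factory_id: str):
        self.factory_id = factory_id
        self.operator = None          # no subscription at manufacturing time
        self.state = "bootstrap"

    def provision(self, operator: str) -> None:
        """Stand-in for a secure over-the-air profile download."""
        if self.state != "bootstrap":
            raise RuntimeError("already provisioned")
        self.operator = operator
        self.state = "active"

sim = EmbeddedSim("89460-000-0001")   # made-up identifier
sim.provision("OperatorA")            # e.g. chosen by the local reseller
print(sim.state, sim.operator)  # active OperatorA
```

The key property is that the operator binding happens after manufacturing, which is what removes the envelope logistics and the field visits from the deployment.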

