“I would say that 2G is not going away anytime soon, but if you are considering new M2M installations today I would definitely recommend going with 3G-compatible devices and 2G/3G-compatible SIM cards. There is no guarantee of maintained functionality in 2G networks ten years down the road (a normal life span for many M2M installations), and 2G CAPEX and OPEX will decline across all operators. Even if you take a slightly higher initial investment with a 3G device, you avoid the risk of having to exchange both SIM card and modem during the life span of your installation. Besides, prices on 3G modules are dropping and are rapidly approaching the price levels of 2G units. This is of course dependent on the life span of your M2M installation and the amount of data you need to transfer. From an operator point of view, I am curious which industry will first make use of 4G networks and M2M to rapidly transfer streaming video for specific surveillance tasks or other data-intensive applications.”
Today some 95% of mobile M2M connections are 2G. That is only natural, since the functionality and capacity needed for most of today’s applications is available in 2G and the modules are substantially cheaper. An off-the-shelf 2G SIM card costs around $3-4 per month plus $0.50-2 per MB of data transmitted (in Sweden), and CSD, GPRS and SMS are enough for most M2M applications today. But there are dark clouds on the horizon! We don’t know how far away they are or how fast they are approaching, but they are definitely there. Let’s look at what these clouds contain.
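As a back-of-the-envelope illustration, the connectivity side of such a business case is easy to sketch. The tariff figures below are simply the Swedish ranges quoted above; the 5 MB/month traffic volume is a made-up assumption for a typical low-bandwidth terminal:

```python
# Illustrative only: connectivity cost over a deployment's life, using the
# ballpark 2G figures from the text ($3-4/month plus $0.50-2 per MB).

def connectivity_cost(monthly_fee, price_per_mb, mb_per_month, months):
    """Total connectivity spend for one terminal over `months` months."""
    return months * (monthly_fee + price_per_mb * mb_per_month)

# One terminal sending 5 MB/month over a 10-year life span:
low = connectivity_cost(3.0, 0.50, 5, 120)   # cheapest end of the range
high = connectivity_cost(4.0, 2.00, 5, 120)  # most expensive end
print(f"10-year connectivity per terminal: ${low:.0f} - ${high:.0f}")
```

Even at the cheap end, connectivity alone runs to several hundred dollars per terminal over ten years, which is worth keeping in mind when comparing module prices.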
A typical M2M deployment counts on terminals being in service for more than five years, often 10 or even 15 years. That’s a long time! It is about 20 years since GSM services came to market, and betting on the same networks still being there with great coverage and good service is something to consider carefully. A customer service person at my previous mobile operator in Stockholm told me: “sorry, but we don’t invest in the 2G network anymore”. One of my companies, Possio, helps mobile operators move analog devices from the fixed network, PSTN, to mobile networks, primarily using circuit-switched connections in 2G (CSD). They see operators, one by one, deciding not to introduce any new CSD-based services in their networks. They keep the existing ones, but obviously not forever. I don’t believe CSD will disappear overnight, but this is worth looking into when making the bets.
The connect part of M2M is the least interesting and rewarding; it is the compute part that makes the difference. IP is today, by far, the dominant communication platform across all industries. The IP development environment is solid and rich, application support is endless, and skills are available everywhere, from developers to support people. An M2M bet today should in most cases be built on IP, and one should really try to understand whether the performance of 2G GPRS/EDGE will be enough for all the computing wanted during the life cycle. It is easy to fool yourself when it comes to performance and capacity. My first business trip with IBM went to Copenhagen in 1983, where serious old men unanimously stated that with this capacity, nothing would stop us any longer. It was an ISDN conference.
The end-of-life problem is always something to take into account. Module manufacturers normally bring new pin-compatible modules to market for their most popular models. But one day they will issue an end-of-life notice; then comes the last order date, and finally there is only the spot market to rely on before it is over. In other words, when a market shrinks, it becomes a game of chicken between the module vendors. They not only want to understand how fast the market is disappearing (remember, they have good numbers to watch) but also want to pick the best moment to move their customers to a new platform without losing them to a competitor.
The cost of 2G and 3G modules differs a lot. As of today, a 3G module is roughly double the price of a 2G module, and the difference can be $25-30. That is a lot, especially if you need many. But it is important to look at the entire cost envelope, both CAPEX and OPEX, over time. The cost of the actual deployment is normally high, since it takes human beings to prepare the installation, to ensure other people involved are available, to get and verify permission for entrance, and finally to go on site. Each installation is obviously different depending on industry, security levels, distances, type of application, etc., but it can easily take a couple of hours per terminal, which translates into hundreds of dollars. One of my companies is active in retail environments, where they often run into problems, especially with access permission and coordination with the other people needed (electricians, operator staff, alarm staff, etc.). This is why we need to get it right the first time – we can’t afford to go back – and why the installations will have to be operational for many years. When planning an M2M solution, this might well be the most important aspect of the business case and the biggest risk of failure.
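To make the trade-off concrete, here is a minimal expected-cost sketch. The numbers are assumptions drawn from the ranges above – a $30 3G premium, a $300 site visit and a $30 replacement module – and only the probability of a forced 2G swap is varied:

```python
# Illustrative sketch: expected extra cost of choosing 2G now, given some
# probability that 2G degrades within the terminal's life span and forces
# a site visit to swap the modem. All figures are assumptions based on the
# ranges mentioned in the text.

def expected_2g_penalty(p_swap, site_visit_cost, new_module_cost):
    """Expected cost of a forced hardware swap during the life span."""
    return p_swap * (site_visit_cost + new_module_cost)

premium_3g = 30.0          # extra up-front cost of a 3G module
site_visit = 300.0         # "a couple of hours per terminal" = hundreds of dollars
replacement_module = 30.0  # module bought at swap time

for p in (0.05, 0.10, 0.25):
    penalty = expected_2g_penalty(p, site_visit, replacement_module)
    verdict = "3G pays off" if penalty > premium_3g else "2G still cheaper"
    print(f"p(swap)={p:.0%}: expected 2G penalty ${penalty:.0f} -> {verdict}")
```

With these assumptions the break-even sits around a 9% swap probability ($30 / $330), so even a modest risk of 2G erosion during a 10-15 year life span tips the case toward 3G.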
I believe this question – should I stay or should I go – is very important for all of us in the M2M business. There are no generic answers to the question of going 2G or 3G, but it seems inevitable that sooner or later 3G will become the primary network, at which point focus and investment, and thus quality and coverage, in 2G networks will erode. How fast this happens of course also depends on geography. In order to shed more light on this important question, I will ask a couple of knowledgeable individuals from within the industry about their views and post them here.
One way of describing M2M solutions is as three steps: collect data, process data, use data. These three steps need standardized interfaces in order to avoid re-inventing the wheel every time we need something from the M2M solution in place. This is true when data is collected, processed and used internally, but even more so when some or all of the data should be made available to someone else. M2M solutions are most often deployed internally, with a business case developed to support the investment. They are typically there to respond to very specific internal challenges or opportunities, which is why it is not entirely natural to think about sharing data externally when the systems are designed. However, I believe we will see increasing business opportunities for owners of data collected in M2M (and other) solutions. I see at least three types of opportunities:
– making internal data available to selected or even all developers might boost the perceived service level of the company. A good example is tåg.info in Sweden, where information about trains and stations is made easily available to developers, who have built popular apps like Train Info Sweden and Tågtavlan. I am convinced these apps have increased the overall perceived service of the train operators. Another great example is the “City of Stockholm Open API”, where a lot of information is made available to developers and where I expect to see more and more information from M2M solutions in the city.
– making internal data available on commercial terms will probably become increasingly interesting. The more services made available, especially for smartphones and tablets, the more important quality of service and differentiation will become. Adding interesting data from a second, maybe innovative, source might single a specific service out from the crowd. Take weather forecasts as an example: the service with the best forecast quality can charge more or attract most of the advertising money.
– making internal data available might impact the brand positively. An example could be creating indexes from the internal data which the public could use for comparisons and as a means of learning how to save energy at home, drive greener, know what to pay for things, etc.
Needless to say, a very well-thought-through strategy has to be in place in order to avoid major mistakes like giving away the crown jewels or compromising people’s privacy.
In order for the big M2M boom to happen, we need to make the bits and pieces fit better together, end to end: to drive down the cost of development and maintenance, avoid duplication of effort, avoid forklift upgrades of systems, simplify integration with partners and improve time to market for services. Standardized and open APIs are important parts of that development, and I think we will see an increasing number of independent middlemen collecting data from different sources, then cleaning and organizing it in order to sell it. Such companies would help establish standard APIs, which is good for the M2M industry.
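As a rough sketch of what standardized interfaces between the collect, process and use steps could look like in code – the class and method names here are hypothetical, not any existing M2M standard:

```python
# Minimal sketch of the collect -> process -> use chain with standardized
# interfaces, so each stage can be swapped, or exposed to a partner or a
# data middleman, without rework. All names are hypothetical.

from abc import ABC, abstractmethod

class Collector(ABC):
    @abstractmethod
    def collect(self) -> dict:
        """Return one raw reading from a device."""

class Processor(ABC):
    @abstractmethod
    def process(self, raw: dict) -> dict:
        """Clean and normalize a raw reading into a shared format."""

class MeterCollector(Collector):
    def collect(self) -> dict:
        # Stand-in for a real device read over GPRS/3G.
        return {"device": "meter-17", "kwh": "12.5"}

class NormalizingProcessor(Processor):
    def process(self, raw: dict) -> dict:
        return {"device": raw["device"], "kwh": float(raw["kwh"])}

# "Use" step: any consumer (internal dashboard, partner API) sees one format.
reading = NormalizingProcessor().process(MeterCollector().collect())
print(reading)  # {'device': 'meter-17', 'kwh': 12.5}
```

The point is not the classes themselves but the contract: as long as every source honors `collect()` and every cleaner honors `process()`, data can be shared externally without re-inventing the wheel each time.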