The idea and development of tools around wireless customer events and measurements is rapidly moving forward. Despite fears of Big Brother watching (any technology can be used for less-than-good means), subscriber geolocation data has the ability to truly revolutionize how wireless operators do business. Knowing where everything in your network occurs at any one time allows for better design and optimization of the network as well as targeted marketing.
The issue is the perception of what you are getting versus the reality of it. 3G technologies, for instance, had soft handover, so in theory for ~40% of the time you had a means of triangulation. Utilizing network delays (transmit time to receive time), you could get a general accuracy of within two hundred meters. Two football fields close. Of course, the sparser the site density, the lower the accuracy. All said, though, if you had enough samples, the inaccuracy of any single sample is overcome by the sample size itself, allowing for usable results.
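As a rough sketch of that last point, the delay-to-distance math and the effect of averaging many noisy samples can be illustrated with made-up numbers (the 1,500 m subscriber distance and the 200 m error figure below are hypothetical, sized to match the accuracy discussed above):

```python
import random
import statistics

C = 299_792_458  # speed of light, m/s

def delay_to_distance(round_trip_s: float) -> float:
    """Convert a round-trip network delay to a one-way distance estimate."""
    return round_trip_s * C / 2

# One measurement with ~200 m of error is a coarse estimate; many
# measurements averaged together converge toward the true distance.
random.seed(42)
true_distance = 1_500.0  # meters (hypothetical subscriber)
samples = [true_distance + random.gauss(0, 200) for _ in range(500)]

print(f"one sample:  {samples[0]:7.1f} m")
print(f"500 samples: {statistics.mean(samples):7.1f} m")
```

The single sample can be off by a couple hundred meters either way, while the mean of 500 samples lands within a few meters of the true value, which is exactly why aggregate geolocation results can be usable even when individual fixes are not.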
LTE brought back a different challenge: a single serving cell, with really only delay to determine distance. The LTE network feature that could give 20 m data is, and most likely will remain, disabled for years ahead, due to the adverse effect on consumer device battery life of utilizing UE GPS measurements regularly.
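For context on what delay-only ranging in LTE looks like: the timing advance the serving cell maintains is quantized in steps of 16 basic time units (Ts = 1/30.72 MHz), which works out to roughly 78 m of radial resolution per step. A small sketch of that arithmetic (the TA index in the helper is illustrative):

```python
C = 299_792_458      # speed of light, m/s
TS = 1 / 30.72e6     # LTE basic time unit, seconds
TA_STEP_S = 16 * TS  # timing-advance granularity (round-trip time)

# One TA step corresponds to this much one-way distance:
meters_per_ta = C * TA_STEP_S / 2
print(f"{meters_per_ta:.1f} m per TA step")

def ta_to_distance_m(ta_index: int) -> float:
    """Rough one-way distance to the UE for a given timing-advance index."""
    return ta_index * meters_per_ta
```

Note this yields only a distance ring around the serving site, not a position; without a second or third site contributing measurements, there is no angle to resolve where on that ring the device sits.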
Suppose you were provided two maps: the first showing events with high accuracy on just the distance from the cell, taken from time-delay information, while the other shows geolocation estimates. The second map would be much more visually pleasing than the first (see below). Sampling engineering management, most would have the second map driving decisions, while the first they would use only as an input to a decision. Now, what decisions would be made in error from the second map for your network? How much capital spend could be misdirected?
Figure 1: Two methods of representing potential geolocated data
With the continued buzz phrase of Small Cells for wireless, imagine the adverse effects on location planning that may occur. If you only get 100-150 m of coverage radius from a true small cell, will 400 m (being generous) of accuracy get your small cell built in the correct location? The geo data is still a valuable input to the process, just not an absolute. Other known factors for the area will also be required to truly identify the correct spots.
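A quick back-of-the-envelope Monte Carlo makes the point concrete. Using the radii above, and assuming (hypothetically) that the geolocation error is Gaussian with a 400 m standard deviation on each axis, we can estimate how often a small cell placed directly on the geolocated hotspot fix would actually cover the true hotspot:

```python
import math
import random

random.seed(1)
CELL_RADIUS_M = 150.0  # true small cell coverage radius
GEO_ERROR_M = 400.0    # assumed 1-sigma geolocation error per axis

def placement_hit_rate(trials: int = 100_000) -> float:
    """Fraction of trials where the geolocated fix falls within the
    small cell's coverage radius of the true traffic hotspot."""
    hits = 0
    for _ in range(trials):
        dx = random.gauss(0, GEO_ERROR_M)
        dy = random.gauss(0, GEO_ERROR_M)
        if math.hypot(dx, dy) <= CELL_RADIUS_M:
            hits += 1
    return hits / trials

print(f"hit rate: {placement_hit_rate():.1%}")
```

Under these assumed numbers the hit rate comes out to only a few percent, which is why the geo data alone cannot be treated as an absolute for small cell siting.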
Network optimization is even more critical for improved accuracy. Geolocation only works if you are on the network. Dead spots caused by outages, limited site density, or over-tilted antennas tend not to be represented in geo tools, mostly because there is nothing to show. And with only general-area accuracy in reality, these holes can quickly be covered up by sampling error. If this data is used to drive Self-Organizing/Optimizing Network (SON) activities for physical network changes (tilts, azimuths, power, etc.), then improvement of the customer experience may be missed entirely.
Geolocation is a game of statistics and algorithms. Vendors will tell you how good their tools are, and operator management will fall for the potential efficiencies gained, all the while the perception of accuracy and the use cases built on it may result in a large, wasteful spend. In the end, geolocation is a tool (still a good tool) and an input for decisions. An input for field and engineering validation. Time is also a friend of geolocation. OEM messaging, improved battery life for GPS-type features, and continued algorithm improvement and tuning by the tool vendors will make the results better. Engineering organizations just need to understand the limitations of today versus the potential of tomorrow.