Little appetite for big data?

Andrew Smith (Hearthstone Investment Management Ltd, London, UK)

Journal of Property Investment & Finance

ISSN: 1463-578X

Article publication date: 4 July 2016


Citation

Smith, A. (2016), "Little appetite for big data?", Journal of Property Investment & Finance, Vol. 34 No. 4. https://doi.org/10.1108/JPIF-04-2016-0024

Publisher: Emerald Group Publishing Limited



Article Type: Editorial. From: Journal of Property Investment & Finance, Volume 34, Issue 4.

A recurring complaint from those undertaking research in the property sector is that conclusions often have to be drawn from incomplete, out-of-date, inaccessible or otherwise imperfect information sources. Data inadequacies mean that even seemingly straightforward questions, such as identifying the volume of stock in a given market, are sometimes surprisingly difficult to answer with precision. This in turn influences how real estate is viewed as an asset class, how it is valued and financed, and how it is occupied and used; it also has implications for the quality of planning policies and other public policy decisions that affect (or are affected by) property.

Ease of access to good information also helps to explain some researchers' predilection for studies of data-rich listed property behaviour rather than comparatively data-poor direct investment. The following lament, from a recent study of investment performance drivers in the UK, is symptomatic: "A unique challenge for fund managers is the integration of real estate's mix of financial and physical characteristics into a formal framework of asset appraisal, portfolio construction and risk management. Type and region factors explain some of the variation in asset returns, but not the majority. The power of constructing a portfolio based solely around these two factors is limited, therefore, and places the greater emphasis on short-term stock selection rather than asset allocation […]. Other attributes are suspected of driving property returns, in particular, an asset's specification, location, current lease terms and tenant strength. But how can these factors be included in an investment process when there is a lack of consistent data measuring their impact on the drivers of asset returns?" (Frodsham, 2016).

Despite the growth and evolving maturity of property research as a profession and as an academic discipline, we still seem to struggle with challenges that would have been familiar to pioneers in the sector, working in the pre-digital 1970s and early 1980s.

The digital revolution was already well under way by 1995, when a study of commercial property data by the UK's Society of Property Researchers (McNamara et al., 1995) observed, "Property research relies heavily upon sample based data, which may or may not be representative of the population as a whole. Despite the efforts of the research community and advances in information technology, available sources are often generalised, fragmented and difficult to disaggregate".

The study identified a number of areas of weakness which remain in the UK today, including the following:

  • there is no comprehensive transaction-based index, and no real time transaction information system;

  • statistics for non-prime property (and therefore the bulk of the standing stock) are fragmented and relatively sparse; and

  • data on floorspace, occupancy and take-up are patchy and incomplete, particularly for anything other than Central London offices.

Researchers at that time were well aware of data shortcomings, and saw the potential for new, large-scale, well-structured relational databases as a platform for research. Considerable efforts were already being made to develop data exchange protocols, geospatial referencing standards and agreed definitions to prepare the industry for the anticipated boom in multi-disciplinary property data analysis. As long ago as 2003, the OSCRE and PISCES organisations agreed to form a single, global operating structure and methodology (Realcomm, 2003), and these standards are now well established, if under-used.

The surprise, perhaps, is that despite the optimism of the time and the subsequent huge advances in digital media and computer processing power, so little has changed in the intervening 21 years. A recent study by the RICS (Cook and Chatterjee, 2015) observed, "Our own sector collects data on valuations, building condition, construction costs, leases, maintenance programmes and operational costs. If we start to take data from public sources in any other country – such as data on land, property value, occupier information, utilities and more – and combine that with local data, our sector holds enormous potential to change how we work. Then, of course, there is commercially held data from investors, banks and corporations that could add even more value. Today data sets are disparate, disconnected and usually inaccessible. This will change quickly as business sees the potential of data".

The authors' logic is hard to refute, but if their optimism is to be justified, the limited progress to date suggests that something will need to change. According to Cook (2015), "It is clear the built and natural environment professions need to work much more closely with the technology sector to partner, collaborate and build new skills that bring together people, place and technology within the context of professional advisory services".

Part of the problem in achieving this seems to be that potentially useful land and property data are frequently held in private databases, often painstakingly compiled over many years and closely protected for reasons of confidentiality or commercial advantage. While there have been successful national and international initiatives to encourage collaboration and data pooling, these remain the exception rather than the rule.

The barriers seem to be a combination of vested interests and a lack of collective will to tackle the problem. Certainly the costs of data storage and processing do not appear to be obstacles. For example, a McKinsey report published five years ago (Manyika et al., 2011) indicated that just $600 would buy a disk drive large enough to hold all the world's music, and it pointed to 40 per cent annual growth in global data generation, supported by only 5 per cent corresponding growth in IT spending.

Now, the rise of the "big data" concept may at last be setting the scene for a breakthrough for real estate. Big data approaches combine processing power and specialist analytical skills to bring together huge, disparate and often incompatible data sets from different sources. If big data are to be "the next frontier for innovation, competition, and productivity", as the title of the McKinsey report suggests, it would seem important for the real estate industry, and researchers in the sector, to identify areas where the value of harnessing big data outweighs the perceived advantages of keeping data private, and to start exploiting them.

Applications for big data in real estate are numerous. Examples include market and investment analysis, valuation (including the evolution of improved hedonic models), credit risk analysis, economic and urban planning, infrastructure planning, tenant and consumer profiling, environmental benchmarking and risk profiling, and aspects of building design and construction. The process of unlocking and combining data sets to improve the toolkit for researchers should create new commercial opportunities for those with the vision and ability to identify and exploit them.
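By way of illustration, the sketch below shows, in outline only and with invented attribute names and figures rather than real market data, how a simple hedonic valuation model of the kind mentioned above might be estimated: log transaction price is related to measurable asset attributes such as floorspace, unexpired lease term and location. Richer, linked data sets of the sort discussed in this editorial would allow such models to be fitted far more reliably.

# Illustrative sketch only: a minimal hedonic regression relating log price to
# asset attributes. All names and figures are hypothetical, for exposition.
import numpy as np

# Hypothetical observations: floorspace (sq m), unexpired lease term (years),
# prime-location dummy (1/0), and achieved price (GBP).
floorspace = np.array([500.0, 1200.0, 800.0, 2000.0, 650.0])
lease_term = np.array([5.0, 15.0, 10.0, 20.0, 8.0])
prime = np.array([0.0, 1.0, 0.0, 1.0, 1.0])
price = np.array([1.1e6, 4.8e6, 2.0e6, 9.5e6, 2.1e6])

# Design matrix with an intercept; the dependent variable is log price,
# the usual semi-log form for hedonic models.
X = np.column_stack([np.ones_like(price), np.log(floorspace), lease_term, prime])
y = np.log(price)

# Ordinary least squares via numpy's least-squares solver.
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, beta_size, beta_lease, beta_prime = coefs
print(f"log-price = {intercept:.2f} + {beta_size:.2f}*ln(floorspace) "
      f"+ {beta_lease:.3f}*lease_term + {beta_prime:.2f}*prime")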

The opportunity has never been greater. The volume of potentially useful information from which to build big data resources continues to grow. The analytical skills and experience to harness the data successfully are developing too, and market globalisation is serving to increase awareness of data best practice from different markets around the world.

Big data have the capacity to fundamentally change the way users view property, through developments in artificial intelligence, new business models and different ways of working. Those of us who manage or measure real estate need to embrace those changes, and apply them to our own working practices, or risk being left behind.

Andrew Smith

References

Cook, D. (2015), "RICS futures: turning disruption from technology to opportunity", Journal of Property Investment & Finance, Vol. 33 No. 5, pp. 456-464

Cook, D. and Chatterjee, P. (2015), Our Changing World: Let's Be Ready, Royal Institution of Chartered Surveyors, London

Frodsham, M. (2016), Defining Investment Quality, Investment Property Forum, London

McNamara, P., Patterson, A., Smith, A. and Wyatt, P. (1995), "The adequacy and accuracy of commercial property data: the need for property data", Working Paper No. 1, Society of Property Researchers, London

Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C. and Hung Byers, A. (2011), Big Data: The Next Frontier for Innovation, Competition, and Productivity, McKinsey Global Institute, Seoul, San Francisco, CA, London and Washington, DC

Realcomm (2003), "UK summit ignites US commercial real estate data standards movement", available at: www.realcomm.com/advisory/69/1/uk-summit-ignites-us-commercial-real-estate-data-standards-movement (accessed 12 April 2016)
