Maritime data analytics has evolved from a passive look-up tool into the new world of digitalisation. Belatedly, shipbrokers are looking to retain control of their ‘unique’ data, but that ship has already sailed
Maritime data analytics is the term given to collecting, analysing and disseminating information on shipping activities. The core of the data is the ship specification, ownership, commercial operation and movements.
The first era of shipping data: the age of chit-chat
The first age of maritime data analytics started with chit-chat in Lloyd’s coffee houses and the subsequent publication of Lloyd’s List in 1734, literally listing vessels arriving in the Pool of London, and their cargoes. This was the age of passive data, which was really only of interest to those in the shipping and insurance industries.
With hindsight, the use of passive shipping data probably peaked in the 1980s, when aggregators such as Lloyd’s of London Press (LLP) had exclusive access to Lloyd’s Agents in ports around the world, reporting back on vessel movements. LLP also published Lloyd’s List, registers of ships and shipping companies and the Confidential Index, a directory of beneficial owners, which was only available to select clients. Larger shipbrokers such as Clarksons also published ship registers, and every respectable broker’s office had shelves groaning with ship registers from different organisations, all telling a different story about the specification of the vessel.
The second era: investigating where we are in the shipping cycle
The start of the second age of maritime data analytics, the investigative age, was triggered by several events. One was the publication of Maritime Economics in 1988 by Dr Martin Stopford. Now those entering the shipping industry had an easy-to-understand textbook explaining supply and demand and shipping cycles. A new generation of shipping analysts developed tools to investigate changes in vessel demand, but they needed a lot of data.
Aggregators like LLP collated ship details on a mainframe, cleaning out duplicates and performing basic analysis before selling the data at very high cost to government agencies. LLP extended the programming to publish changes in sector size, fleet and hence cargo movement data in aggregate form in publications such as Lloyd’s Shipping Economist.
Meanwhile, the ever-innovative Dr Stopford and Cliff Tyler of Clarkson Research developed the Shipping Intelligence Weekly, which took the basic broker’s report and included indicative indices and text explaining what was happening in the shipping markets. Dr Stopford has likened collecting shipping data at the time to a car driving along a road with the windscreen blacked out. It was the Shipping Intelligence Weekly’s role to report the scenery passing by the front windows.
Then IMO adopted Resolution A.600(15) in 1987 calling for all ships to be given a unique number, aiming to prevent fraud, identify polluting vessels and enhance safety. The IMO number came into force in 1996. Now there was a unique identifier that stayed with the ship, no matter how many times it was sold or the name changed.
The arrival of the IMO number was the start of big data in shipping, but few realised its significance at the time. That must explain why awarding and recording ship IMO numbers was not kept in-house at the IMO or tasked to an industry body like the Baltic Exchange, but was given to a private company.
With hindsight, this was a huge commercial advantage and the other publishers of shipping data were livid.
Meanwhile, the internet arrived, and suddenly vessel specifications and IMO numbers were on owners’ websites. This brought about an expansion in the research departments, with rooms full of analysts peering at screens to update ship specifications on an internal database and match sales in brokers’ reports, and names of vessels to those reported in Lloyd’s List and Tradewinds.
Brokers and some shipping consultancies were at the forefront of producing online shipping databases with analytical features, but these were only available to carefully chosen clients. There was considerable paranoia about data being stolen. Indeed, at least one shipbroker was taken to court over the theft of electronic data.
In the midst of all this distrust, Dr Stopford took an alternative path and chose to sell the Clarkson database of ships, prices and values through the Shipping Intelligence Network (SIN) at a reasonable price to whoever wanted it. For many years the easy-to-use SIN was the de facto shipping database, and there are probably hundreds of shipping analysts today who owe a debt of gratitude to SIN.
The third era: big shipping data
We are now in the third age of maritime data analytics, which is the age of prediction using algorithms and integrating several strands of data through digitalisation. The driving force behind the digitalisation of shipping data is often outsiders to shipping. This is perfectly summed up by the activity of CargoMetrics, a hedge fund based in Boston. In 2010, Dr Scott Borgerson, a US Coast Guard Academy graduate with an impressive array of educational qualifications, had an idea for exploiting the fact that demand for shipping is predicated on the micro and macro supply situation. Although he was a mariner, he had little experience of the commercial world of shipping.
So he set out to learn all about the commercial realities of shipping, and applied maths to the problem of predicting cargo flows, supply shortages and demand growth. CargoMetrics’ data scientists are sucking up data on commodities using customs data, vessel AIS movements (satellite and terrestrial), images and other raw data from hundreds of sources on a daily basis. The company has a dynamic register of ships (it is no secret that CargoMetrics is working with Clarkson Research Services) and has geo-tagged the global ports. This means CargoMetrics can track a ship to the berth and then cross-reference the cargo against customs data, images from ports and other data.
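Tracking a ship to the berth by cross-referencing AIS positions against geo-tagged ports comes down to a geo-fencing test. As a minimal sketch, and assuming nothing about CargoMetrics’ actual systems, the core check is a point-in-polygon test of the kind shown below; the berth polygon and positions are invented examples.

```python
# Hypothetical sketch of geo-fenced berth matching: does an AIS
# position fall inside a geo-tagged berth polygon? Uses the standard
# ray-casting algorithm. All coordinates below are invented examples,
# not data from any real port or vessel.

def point_in_polygon(lon, lat, polygon):
    """Return True if (lon, lat) lies inside polygon [(lon, lat), ...]."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from the point cross this edge?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Invented rectangle standing in for one geo-tagged berth
berth = [(4.00, 51.90), (4.05, 51.90), (4.05, 51.95), (4.00, 51.95)]

print(point_in_polygon(4.02, 51.92, berth))  # inside the berth -> True
print(point_in_polygon(4.10, 51.92, berth))  # outside the berth -> False
```

Once a position matches a berth, the visit can be cross-referenced against customs records for that port, which is the kind of joining-up the text describes.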
As a hedge fund, CargoMetrics is monetising all this research by arbitraging the information, trading on hundreds of stock exchanges around the world. This is an outsider using algorithms to predict ship movements, with no interest in selling the data to those in shipping. Indeed, to do so would limit CargoMetrics’ ability to arbitrage.
What is the fourth era of maritime data analytics?
In keeping with a naturally conservative industry, shipping insiders persist in believing that the data they hold is unique and valuable. The recent negotiations between the Baltic Exchange and the shipbrokers to impose value and copyright on the data they supply reinforce this stereotype.
It is a question of scale and data requirements. Trading companies and their data suppliers are far larger than shipping companies, and far more profitable (the market cap of the largest shipping company, AP Moller-Maersk, is a quarter of that of any one of the top ten traders). These companies use this muscle to create trading platforms that use algorithms to best guess the likely employment of the fleet.
For example, Paris-based Kpler is scooping up data from AIS satellite feeds, customs data, ports agents, shipbrokers and traders. The service claims to be able to give the employment of 70% of the vessels in its system, and predict individual vessel movements and cargo flows. The platform was built for trading houses and is now being offered in the shipping market.
One characteristic of the fourth era in maritime data analytics will be workarounds for the problem of not having access to unique data, and there is one obvious group of shipping insiders whose unique access to maritime data has not been exploited.
The crew know the intimate specification of the ship, probably better than the owner. They know the volume of cargo loaded, not some rounded-up figure broadcast by AIS, or some guesstimate based on the AIS draught. They have the up-to-date position and speed.
The fourth era of maritime data will be characterised by how and when the trading companies choose to tap into this valuable information, and how the seafarer is rewarded for participating. One option would be a smartphone app that broadcasts speed and position and offers incentives for crew to add higher levels of data. From chit-chat to passive data gathering to investigating the data to big data algorithms and digitalisation, the next era could be the crowd-sourcing of maritime data from those in the know.