Big data vs important data in the maritime industry



There has been a lot of talk in recent years about Big Data in shipping and its potential to transform the maritime industry. But is it possible that traditional data, and in particular important data, matters more to overall operating efficiency and decision making?

Ships and the shipping industry have always generated large amounts of data, which ship operators have used for a variety of purposes: monitoring vessel performance, gaining a competitive advantage by getting maximum efficiency from their assets, reducing costs at ports, planning vessel maintenance and cutting fuel consumption.

Operating ships is a complex business and differs between sectors. Although it is recognised that around 90% of world trade involves carriage in ships, the demands of the liner sector, where ships can carry goods for thousands of shippers intent on satisfying consumer demands, are very different from, say, the crude oil tanker sector, a far less complex system involving far fewer ports and traders.

The bulk sector is different again with shipping companies favouring one or more trade types but needing to switch between several to ensure the ship remains profitable. Passenger ships – whether cruise or ferries – are another sector with entirely different operating needs.

Unforeseen events have the power to disrupt

Implementing big data is almost certainly of most use to the liner sector, where the shipping and logistics industry can make the best use of big data analytics to predict future trade patterns. But as the last two years have demonstrated, a single event can rapidly change conditions. Almost no one could have foreseen the pandemic and its impact on both shipping and data quality.

The port congestion that has occurred this year as the shipping industry began to recover has forced many major shippers to reconsider their approach to logistics, and there is now a big debate over whether the “Just in Time” principle for trade is still valid. In the future, the data gathered from the shipping and logistics industry during the pandemic may help avoid similar problems if such an event occurs again, but in the short term the impact may actually cause many to question the claims made for big data.

Big data from the ship

For the majority of ship operators, regardless of sector, data on ship operation is the most valuable, concerned as it is with vessel performance, operational efficiency and fuel consumption. Much of the data gathered from sensors on the engine, cranes, and in the holds and tanks is potentially useful to equipment manufacturers, who can apply advanced data processing techniques to discover hidden patterns in equipment behaviour and so avoid costly problems caused by a developing fault that might have been overlooked using traditional data interrogation methods. Fuel consumption data also aids decision making on decarbonisation efforts and, for ships over 5,000gt covered by the EU and IMO fuel monitoring and verification regulations, can form the basis for the required reporting.
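As a minimal sketch of what "discovering hidden patterns" can mean in practice, the following flags sensor readings that drift outside a rolling statistical band. The function name, window size and threshold are illustrative assumptions, not taken from any specific manufacturer's system.

```python
# Flag engine sensor readings that deviate sharply from recent behaviour.
# All names and thresholds here are illustrative, not from a real product.
from statistics import mean, stdev

def flag_anomalies(readings, window=10, n_sigmas=3.0):
    """Return indices of readings that deviate from the rolling mean
    of the preceding `window` readings by more than n_sigmas stdevs."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > n_sigmas * sigma:
            anomalies.append(i)
    return anomalies

# Example: a sudden spike in exhaust temperature stands out against
# normal fluctuation and is flagged.
temps = [350.0 + (i % 3) for i in range(20)] + [420.0] + [352.0] * 5
print(flag_anomalies(temps))  # → [20]
```

Production condition-monitoring systems use far richer models, but the principle is the same: a statistical baseline of normal behaviour makes a developing fault visible long before a human scanning daily reports would notice it.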

Once, all of this data would have consisted almost entirely of completed forms and reports. Today, modern communication systems allow non-traditional data to be sent in real time. This can take the form of videos or photographs highlighting a problem with equipment that is impacting overall operational efficiency, enabling informed assistance to be given to help rectify it.

Ship data is needed by many shore-based staff in shipping companies. The operations department needs real-time data, as contained in the daily position reports, to determine future employment of the vessel. The human resources department needs to keep an eye on crew changes that may need to be arranged, wages and expiring certificates.

Changing ways increase data flow

Here again the pandemic has initiated a changing attitude to conventional business practice. More people are being forced to work remotely, and this has certainly increased the volume of data being generated. More to the point, whereas staff would once have discussed issues face to face, they are now forced to communicate using Teams or some other form of video conferencing, and to share documents and data by email.

These new work methods are in their infancy and increase reliance on communication systems, bringing cyber threats that have to be taken into consideration. Remote working also means that better management of data is needed. Whereas a physical file would normally be held by one person at a time, electronic data can be worked on simultaneously by several employees. That may imply improved efficiency, but it brings a risk of confusion, with different versions of files and documents in circulation.

Better data flow management

What is needed is a better means of controlling the data flow. Products such as GTReplicate make it possible to manage the flow of many data feeds ship to shore, shore to ship, and even to tie the ship into cloud-based storage such as SharePoint.

GTReplicate provides a solution for replicating data between ship and shore, reducing the time and administration required of IT, which ultimately can lead to lower costs and greater levels of compliance and assurance. Changes made to master files and documents on shore can be updated or replaced fleetwide, with delta replications ensuring only the amended data is transferred to the vessel.
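GTReplicate's internal mechanism is not described here, but the general idea behind delta replication can be sketched with a simple chunk-hash scheme: the sender compares hashes of fixed-size chunks and transmits only the chunks that differ. All names and the tiny chunk size below are illustrative.

```python
# A minimal sketch of delta replication: only chunks whose hashes differ
# from the shore-side master are sent to the vessel. Illustrative only;
# real systems use larger chunks and rolling checksums.
import hashlib

CHUNK = 4  # bytes per chunk; deliberately tiny for the example

def chunk_hashes(data: bytes):
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def delta(old: bytes, new: bytes):
    """Return (index, chunk) pairs the receiver needs to apply the update."""
    old_h, new_h = chunk_hashes(old), chunk_hashes(new)
    changes = []
    for i, h in enumerate(new_h):
        if i >= len(old_h) or old_h[i] != h:
            changes.append((i, new[i * CHUNK:(i + 1) * CHUNK]))
    return changes

def apply_delta(old: bytes, changes, new_len: int) -> bytes:
    buf = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for i, chunk in changes:
        buf[i * CHUNK:i * CHUNK + len(chunk)] = chunk
    return bytes(buf)

ship_copy = b"VOYAGE PLAN REV A!!!"
master    = b"VOYAGE PLAN REV B!!!"
changes = delta(ship_copy, master)
assert apply_delta(ship_copy, changes, len(master)) == master
print(len(changes), "of", len(master) // CHUNK, "chunks transferred")
```

Over a metered satellite link, sending one amended chunk instead of the whole file is the difference between an update measured in kilobytes and one measured in megabytes.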

The tool allows secure exchange of data across fleets without interruption. Built-in features within the application give the user the ability to modify transfer rates, ensuring large updates do not interfere with the connectivity experience of the crew, and the application can scale to various connectivity scenarios.
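The idea of modifying transfer rates so a large update does not swamp the ship's link can be illustrated with a token-bucket throttle: the sender sleeps whenever it gets ahead of a configured bytes-per-second budget. The class and parameter names below are assumptions for illustration, not taken from GTReplicate.

```python
# A minimal token-bucket sketch of transfer-rate throttling: sending is
# paused whenever the configured bytes-per-second budget is exhausted,
# leaving headroom on the link for the crew. Illustrative names only.
import time

class ThrottledSender:
    def __init__(self, bytes_per_second: int):
        self.rate = bytes_per_second
        self.allowance = float(bytes_per_second)  # bucket starts full
        self.last = time.monotonic()

    def send(self, chunk: bytes) -> None:
        now = time.monotonic()
        # Refill the bucket for the time elapsed, capped at one second's worth.
        self.allowance = min(float(self.rate),
                             self.allowance + (now - self.last) * self.rate)
        self.last = now
        if len(chunk) > self.allowance:
            # Not enough budget: wait until the bucket has refilled enough.
            time.sleep((len(chunk) - self.allowance) / self.rate)
            self.last = time.monotonic()
            self.allowance = 0.0
        else:
            self.allowance -= len(chunk)
        # ... the actual network write would happen here ...

sender = ThrottledSender(bytes_per_second=1024)
start = time.monotonic()
for _ in range(4):
    sender.send(b"x" * 512)  # 2 KiB total at 1 KiB/s
elapsed = time.monotonic() - start
print(f"elapsed: {elapsed:.2f}s")  # roughly 1 s: half the data waits for refill
```

A real product would apply this per data feed and adjust the budget to the connectivity scenario, but the mechanism, spending a byte budget that refills over time, is the standard way to keep bulk transfers from crowding out interactive traffic.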