
Historically, a customs officer’s “intuition”, backed by knowledge and experience, served as the means for effective risk management. In the old days (twenty or more years ago) there was no need for any ‘Big Data’ mumbo jumbo, as the customs officer learnt the skill through painful but real-life experience, often under harsh and inhospitable conditions.

Today we are a lot softer. The age of technology has, rightly or wrongly, superseded the human brain. Nonetheless, governments rely on their big-spend technology budgets to ensure the safety of their economies and supply chains.

No less responsible are the big multinational corporations, whose ‘in-house’ business is no longer confined by national boundaries or continents and which generate huge amounts of data extending to the limits of their operations. When the products of such business must traverse national boundaries and continents, their logistics and transport intermediaries, financiers and insurers become tied up in the same cycle of data generation and transfer, likewise spanning national boundaries, to ensure those products arrive at their intended destinations – intact, on time and fit for purpose. Hence we have what has become known as the international supply chain.

It does not end there. Beyond the Customs authorities, there is a myriad of other government regulatory authorities, each with a plethora of forms and information requirements which must be administered and approved prior to departure and upon arrival of goods at their destination.

Inefficiencies along the supply chain culminate in delays and added cost, which determine the viability of the product for sale and use upon delivery. These may constitute what are called non-tariff barriers (NTBs), which negatively impact a supplier’s credibility in international trade.

The bulk of this information is nowadays digitised in some form or other. It is obviously not all standardised and structured, which makes it difficult to align, compare or assimilate. For Customs, it nonetheless poses a significant opportunity to tap into and utilise for verification or risk management purposes.

The term ‘Big Data’ embraces a broad category of data or datasets that, in order to be fully exploited, require advanced technologies to be used in parallel. Many big data applications have the potential to optimize organizations’ performance – and here we have it – through the optimal allocation of human or financial resources in a manner that maximizes outputs.

The purpose of this paper is to discuss the implications of the aforementioned big data for Customs, particularly in terms of risk management. To ensure that better-informed and smarter decisions are taken, some Customs administrations have already embarked on big data initiatives, leveraging the power of analytics, ensuring the quality of data (regarding cargoes, shipments and conveyances), and widening the scope of data they could use for analytical purposes. This paper illustrates these initiatives based on the information shared by five Customs administrations: Canada Border Services Agency (CBSA); Customs and Excise Department, Hong Kong, China (‘Hong Kong China Customs’); New Zealand Customs Service (‘New Zealand Customs’); Her Majesty’s Revenue and Customs (HMRC) of the United Kingdom; and U.S. Customs and Border Protection (USCBP). Source: WCO