New Zealand’s green data future stymied by lack of second cable

New Zealand will never be an exporter of data, and will continue to suffer as a major importer, until the country gets a second fibre optic cable.

“By implication, we’re missing out on some of the major opportunities for hosting and exporting cloud-based services, from massive data crunching to film rendering, as well as storage,” says TUANZ chief executive, Paul Brislen.

“What makes it worse is we could provide increasingly demanded green power to run these massive installations. Arguably, it would be a much better use of the 15% of our total power that runs Tiwai Aluminium.”

Brislen says the government and others are making a fundamental mistake in thinking a second cable would only handle incoming data. That’s a feed to only 4 million people.

The ability to export data to billions, which we don’t have at the moment, completely changes the equation, Brislen says.

“It’s a bit of a chicken and egg argument, but we’re totally forgetting the use we as inventive Kiwis will make through being able to easily and cheaply export data to the world,” he says.

A second cable would guarantee redundancy and an instant backup for those companies willing to build and service server farms and the river of data streaming from them.

Brislen says that throwing in the green data cloud storage opportunity makes a second cable all the more attractive.

“The other part of the infrastructure for all this is the use of multicore computers and parallel programming, and the total IT architecture change that brings about,” says Brislen.

Multicore World 2013, a two-day conference with an internationally recognised cast of speakers on the opportunities, challenges and implications of many-core processors on one chip, takes place on February 19 & 20 at Wellington Town Hall.

“I’d encourage as many people as can to get to the conference,” says Multicore World 2013 founder Nicolas Erdody. “It’s the only place, certainly in the southern hemisphere, where you’d learn about and understand what’s happening with the intersection of fundamental drivers of the next data and communications revolution.”

“I can imagine too, Multicore World’s break-time conversations will be extremely interesting. As always, that’ll be where connections and deals are made, insights gained.”

Multicore World 2013 boasts speakers who are authorities on the computer architecture that allows parallel processing and massively increased performance in computers, smartphones and other devices.

What is multicore?

The ability of computers to process massive amounts of data has been growing ever since they were invented.

As computer power has increased, processing speed has hit a physical barrier: more processing power cannot be put onto a single chip without it overheating.

The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
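As a quick illustration (not from the article), any modern machine will report how many cores its chip exposes; a minimal Python sketch:

```python
import os

# Ask the operating system how many logical CPU cores are visible.
# Any recent laptop, desktop or phone chip will report more than one.
cores = os.cpu_count()
print(f"This machine exposes {cores} logical CPU cores")
```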

Multicore chips are also more power efficient, and the number of cores that can be added is virtually unlimited.

Previously impossible computational tasks can now be achieved, and processes which previously took days or even weeks to perform can now be done swiftly.

But while this new processing power enables
computers to do things faster, it also adds new challenges.

Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written with parallelism in mind.

But parallel programming is different from traditional programming, and so far few programmers have experience with it.
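To illustrate the shift, here is a minimal sketch (the names and workload are invented for illustration) of the same computation written the traditional, sequential way and then in parallel using Python’s standard multiprocessing.Pool, which spreads independent work items across CPU cores:

```python
from multiprocessing import Pool

def cost(n):
    # Stand-in for an expensive per-item computation
    # (e.g. one frame of a render, one chunk of a dataset).
    return sum(i * i for i in range(n))

def sequential(jobs):
    # Traditional style: one core, one job at a time.
    return [cost(n) for n in jobs]

def parallel(jobs, workers=4):
    # Parallel style: the work is split into independent units,
    # and the pool runs them on separate cores simultaneously.
    with Pool(workers) as pool:
        return pool.map(cost, jobs)

if __name__ == "__main__":
    jobs = [200_000] * 8
    # Both versions compute the same answers; only the execution differs.
    assert sequential(jobs) == parallel(jobs)
```

On a multicore machine the parallel version can finish several times faster for large, independent jobs, but the programmer has to restructure the code so that the units of work do not depend on one another, which is exactly the change in thinking the article describes.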

Multicore hardware is now mainstream, but the techniques to program it remain (as yet) niche.

In the next 10-15 years, there will be huge opportunities to translate legacy sequential (‘traditional’) code, and to create new software that takes full advantage of thousands of cores in the next generation of chips.

Around the world, parallel computing is currently used to process the vast quantities of data produced by the internet: the “big data” originating from social networks and from millions of intelligent data-recording devices connected to it.

Here in NZ it is also used at the biggest CGI rendering facility in the world, Wellington’s Weta Digital.

And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope – a global scientific project that New Zealand is a part of.

In addition, there is a wide range of services, solutions and systems integration challenges in connecting the two worlds together.
