As part of our mission to help Scotland become a market leader in data innovation, we’re proud to welcome Julie Duncan as our dedicated Senior Business Development Manager for the Scottish region.

With more than 15 years’ experience in the data sector, Julie will primarily be supporting our existing clients in Scotland, while growing Data8’s base nationally.

To carry on our blogging tradition, Julie has taken some time out of her busy schedule to introduce herself to the Data8 family.

Tell us about your background…

“After leaving school, I decided to take a break from my studies, and it was in that ‘break’ I naturally fell into my sales career. I spent the next 8 years working for a Scottish branch of a large organisation that specialised in IBM mid-range IT equipment (now known as the IBM i-series).”

“Looking for a change in direction, I moved across to Experian (formerly QAS), and this is where I got hooked on all things data and became a true postcode geek.”

What did your Experian journey entail?

“During my time at Experian, I worked in various sales roles, initially growing their private sector base in Scotland, before moving on to manage the retention of their top tier address management accounts. My final role at Experian involved taking on a national position with a focus on growing the data quality portfolio for public sector accounts.”

“Throughout this journey, I was voted ‘Salesperson of the Year’ for 3 consecutive years, as well as being nominated twice for the Experian Elite programme, and winning some truly fantastic incentives including trips to New York, Marbella and Munich.”

What’s your plan for Data8?

“I am very excited to be joining Data8, where my role will be managing some of their existing clients whilst growing their Scottish base. Experian haven’t had a dedicated sales resource in Scotland in this area for many years, so there’s a real opportunity to capitalise on that gap through focused customer service.”

Summarise your time at Data8 so far

“I have thoroughly enjoyed my first few weeks here at Data8, and I can’t wait to bring innovation to Scotland’s data market.”

“It’s clear to see that Data8 is a truly dynamic company with a range of award winning products, and a strong focus on customer success.”

“I feel very privileged to join this growing company.”

Julie Duncan

And when you’re not talking about data?

“Outside of work, you’ll usually find me spending quality time with my family, running or enjoying wine with friends.”

To experience the benefits of partnering with our expanding, award-winning company for yourself, sign up for a free trial, or organise a free demo with our expert team today.

There’s no denying that the world’s commerce is quickly moving online.

In fact, according to Statista, retail eCommerce sales grew by 23.3% in 2018, and on top of that, Nasdaq predicted that by 2040, around 95% of all purchases will be via eCommerce.

As a result of this, it’s essential for business owners to not only stay up to date with the latest eCommerce trends, but also meet the growing demand for first class, accurate and reliable shopping cart solutions.

Things are changing for Magento

Magento is currently the world’s largest eCommerce platform, powering around 245,000 websites around the globe.

Since the release of Magento 2 in 2015, it’s been announced that official support for the original platform will end in June 2020, meaning there will be no more security patches released and no more support with regard to the platform, its features and functionality.

Because of this, existing extensions have been rebuilt, and official merchant support will soon come to an end.

So with less than 10 months until the cut-off, now’s the perfect time not only to migrate but also to review your current plugins.

Where does Data8 come into this?

With the increase of eCommerce, and more specifically mCommerce, we’re now able to place orders quickly, on the go and in-between other tasks. But as a result of this, we’re also more likely to make typos, increasing the risk of submitting inaccurate data and delaying the order process.

1959 marks the making of many things. It was the year of the first colour picture of Earth seen from outer space, the launch of the first weather satellite, and the trial of the modern Postcode.

While a lot has changed since 1959, Postcodes have remained a reliable system for the people of the UK. As the Postcode celebrates its 60th anniversary today, we’ve looked back at a time when the routing and delivery of mail wasn’t so simple.

In the early 1800s, people used post towns and county names to direct mail to its recipient, but as the demand for delivering letters and parcels increased, a better system was needed.

By the end of 1934, the UK was split into postal districts based on the points of the compass. Fast forward to 1959, and the reliable, memorable and efficient six-character code we use today was successfully launched.

As digital innovation soared throughout the 1980s, a database containing every address and Postcode in the UK was developed into what we know as the Postcode Address File (PAF).

Today, the PAF consists of over 30 million addresses and 1.8 million UK Postcodes, and Data8 helps countless organisations and individuals access it instantaneously in numerous aspects of everyday life – including online shopping, identity verification and locating geographic areas.

Data8 are delighted to have been awarded a place on the G-Cloud 11 framework for another year running!

The G-Cloud framework allows public sector organisations to find and purchase cloud-based services quickly and easily. That way, they can be confident knowing they’re only investing in advanced, accurate and secure cloud-based services from a reliable business.

As an approved supplier, our market-leading data validation, cleansing and Duplicare services are now searchable on the government’s Digital Marketplace. Thanks to this achievement, we’ve already helped countless high-profile government bodies transform their digital experience.

This is a testament to our stringent security standards and quality measures, and proof that our cutting-edge data services continue to make a difference to countless businesses across the UK.

To learn more about our past success stories, check out our recent case studies, or you can read more about the G-Cloud framework here.

Welcome to the team Sophie!

As a growing company, we’re always looking for new ways to deliver an excellent customer experience to countless businesses worldwide – from helping companies realise the importance of data quality, right through to implementing the appropriate products and providing long-term, unsurpassed support. As part of this mission, we recently welcomed Sophie into our Marketing team and asked her to write a blog about her experience in the field and time here so far.

Back in 2016, I received a First-Class degree in English and Media at Edge Hill University. In the years following my academic training, I worked my way up from Junior Copywriter to Senior Copywriter (Team Leader) at a leading audio branding agency. From this, my natural flair for the written word has grown into a passion for creating compelling content, and an eagerness to develop my writing within the world of marketing.

Today, I’m extremely pleased to be working at global data specialists, Data8, in my new role as a Digital Marketing Executive. Utilising my extensive writing experience, I plan to help Data8 communicate and engage with their existing and potential customers with simple, clear and informative content while driving their business online.

From the moment I walked through the doors here at Data8, everyone has made me feel extremely welcome and I feel incredibly lucky to be part of such an unbelievably talented team. As I begin to navigate the countless benefits of data quality and become familiar with the contributing factors to a business’ success – along with the innovative and pioneering products Data8 deliver – I can’t wait to learn more in my new role, make an impact on the business and progress in a career I love.

When I’m not writing, you’ll usually find me watching reality TV, working up a sweat at the gym, or getting lost in the pages of an enthralling thriller book. On the weekend, I love an occasional glass of wine, trying out new restaurants and hanging out with friends and family.

To experience the advantages of partnering with our expanding, market-leading team for yourself, sign up for a free trial today, and let the rewards speak for themselves.

In what is becoming a tradition, we ask a new starter to write a blog for us to induct themselves into the world of Data8. This time it is the turn of Tess. Since writing this blog, she has discovered that she passed her degree with a 1st, which is absolutely fantastic. Tess – we salute you.

My name is Tess and I recently started work at Data8 as a software developer. I had just finished a Computer Science degree while raising my two children when I was offered the job.

My university course was based at Thornton Science Park. The course (which was about 98% male) was fantastic and gave me opportunities to learn about interesting modules like artificial intelligence and Cisco networking. The experience of being one of very few women among the students, or indeed the staff, made me feel even more determined to finish the course and surpass average results.

I was initially apprehensive, as I found programming very daunting, and I was worried that I would struggle to keep up with the amount of information I would need to take in. After 2 days I was much less nervous: I was given tasks to complete, but with no monitoring, so I was able to spend as long on each task as I wanted without feeling pressured to finish, and I could learn as much as possible by trying things as many ways as I wanted.

I have felt welcome from day one; the equipment is great and there’s lots of space. I love that there’s a Dev Team meeting each week, so you know roughly what people are working on, or planning to work on.

On a personal note I’m a keen prosecco connoisseur, a total foodie, enjoyer of gardening, hiker of countryside, reader of books and watcher of films.

I’m very pleased to announce the release of enhanced usage reporting for Data8’s suite of data validation web services.

From today, you will be able to retrieve more customised, detailed breakdowns of your usage to help identify systems that are either not making full use of our services, or are generating unexpectedly high request volumes.

On top of the previous daily usage totals, you will now be able to separate out usage by:

End-user client name (for our reseller partners)

Authentication Type

Protocol Type

Referer Domain

API Key

IP Address

Application Name

We support a variety of methods to integrate with our services, and the authentication type and protocol type details will help you to identify areas where you may be using older technology and where an upgrade may offer you a range of new options.

To access the new usage reports, log into your Data8 Dashboard and select the service you want the usage report for. You will then see the following options immediately below the usage chart:

Select the time period you want the report for, tick the boxes you want to break down your usage by, and click Download.

These additional data fields are only available from 14th May 2019; any earlier days will have a single entry with all of these fields blank.

Authentication Type

The available authentication types are currently:

UserPass – using your Data8 username and password to authenticate your requests is the simplest method, but also the least flexible

ApiKeyOld – an older format of API key, only usable for client-side JavaScript requests; we recommend you generate a new API key

ApiKey – a flexible method of authenticating client- or server-side requests, with options for restricting access by domain name, IP address, service and rate limiting

Anyone seeing the ApiKeyOld authentication type should migrate to the ApiKey option simply by generating a new API Key and updating their integration configuration with the new key.

If you see the UserPass authentication type, you can also switch to the ApiKey method: simply generate a server-side key, then change your integration configuration to use the username “apikey-xxxx-xxxx-xxxx-xxxx” (replacing xxxx-xxxx-xxxx-xxxx with your new API Key) and a blank password.
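As a sketch, the change amounts to swapping the credentials in your integration configuration; the object shape below is illustrative rather than a Data8-defined structure, but the username format and blank password come straight from the description above:

```javascript
// Illustrative config objects: switching from username/password auth to an
// API key by using the documented "apikey-..." username and a blank password.
var oldConfig = {
  username: "my-data8-username",
  password: "my-password"
};

var newConfig = {
  username: "apikey-xxxx-xxxx-xxxx-xxxx", // replace xxxx-xxxx-xxxx-xxxx with your new API Key
  password: ""                            // blank password
};
```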

Protocol Type

ADO – requests made to the recordset.ashx endpoint to load data into an ADO Recordset object, commonly from VB6 or VBScript code

PROXY – client-side JavaScript requests made using the loader.ashx script

HTTPPOST – REST requests made using HTTP POST requests to the *.asmx endpoints

HTTPGET – REST requests made using HTTP GET requests to the *.asmx endpoints

JSON – REST requests made using HTTP POST requests to the *.json endpoints

We would recommend that clients migrate to the JSON endpoint in most cases, as it provides the greatest flexibility for both client- and server-side integrations, supports all the different authentication types and has good support in all major development frameworks.

Referer Domain

If you are making client-side JavaScript requests, this option allows you to see how many requests are coming from which domain. This doesn’t include the full URL, only the domain name. It does however differentiate between subdomains, so “data-8.co.uk” and “www.data-8.co.uk” would be listed separately.

API Key

You can generate as many API Keys as you like, and use different keys to allow you to measure usage by different sites, applications, users or any other metric that makes sense to you. Only new-style API keys in the format xxxx-xxxx-xxxx-xxxx will be listed here. If you are using an older-style API key, please generate a new one and update your integration configuration to start taking advantage of all the improvements offered by the new version.

IP Address

This lists the IP address that we receive the request from. We do not record this for client-side JavaScript requests, only for server-side requests. Each request can therefore be separated out by either referrer domain or IP address, but not both.

Application Name

This is a new, flexible method of differentiating usage by different applications under the same account. To start making use of this, simply include an “ApplicationName” option in your requests. It is up to you how you use it – it could be the name of the application that the service is integrated with, or the name of the website, or a particular functional area, or however you need to be able to measure your usage.

Big data is not going to be a 15-minute wonder in the fashion calendar. Data is the new black, darling, and is here to stay! We are not talking about Dolce & Gabbana using drones on the runway to promote their new purses, or how Plein used a robot to launch his fashion show.

There is already a huge amount of data being harnessed to allow designers to predict the next big trend, or to ensure that stock levels are being replenished within the various regions.

But with fast fashion getting faster, and customers not just expecting but demanding quicker, better service and delivery across online channels and multiple touchpoints, at a lower price than the competition, fashion houses need to harness consumer data more efficiently, with minimum wastage and as smoothly as possible.

Address Validation

So, the customer is at the point of buying everything in their cart and needs to enter a delivery address. This can be captured in various ways – house number first, postcode first, by county. Address Lookup can find your address within 8 characters on average, verified by Royal Mail and numerous other address data companies, in the correct format for your database – all in a fraction of a second!

Mobile Verification

Input mistakes in mobile numbers are common, and without the right checks, could mean the difference between a great review and a poor one, as well as a huge waste of resource and money. Imagine how easy it is to enter the wrong number or add an extra digit. Without the correct checks, this could mean the customer not receiving the text message from the courier or a text purchase confirmation. A phone call of complaint to the customer service team could result in cancellation of the order, followed by the writing of a bad review. All of this is easily preventable. Did you know that you can even see which country the mobile is in? Have a friend who is abroad at the moment? Give it a try!

Data Cleansing & Deduplication

So, now that you have won over the customer, who has become a regular shopper, buying from you online, through the shop and even concessions, you could be entering their data in different ways.

Let’s take Margaret Ward. She could be recorded as Maggie, Marge or Marrgaret, and Ward could be Word, Warde or Warrd. Even the address could be entered as 9, nine, nein! It all depends on who entered it and what mood they were in. Of course, you could use the address check on your EPOS or Customer Service screens, but there is still the chance of errors.

This is where data cleansing and deduplication comes into play. Marrying up the data correctly can ensure that you are not sending out numerous brochures and mailings to the same person, which not only shows that you don’t know who Margaret is, but is a huge waste of money and could have an impact on your bottom line.

Data is here to stay and can only grow bigger. By getting these elements right, you will not only present the next fashion trend to your customer faster and more accurately, but also improve your overall purchasing and data cycle. Find out more at Data8.

We are pleased to announce that V3 of the Data8 TPS solution is now available for you to use within your system. The update itself doesn’t come with any new features for existing customers; however, it does make configuration easier than ever before for new customers or customers looking to migrate from one system to the other.

Previously the tool required advanced CRM users with an understanding of complex tools such as the Plugin Registration Tool; however, this requirement has now been removed and all configuration can be done directly within the Dynamics platform, without any other tools.

Configuration for TPS and CTPS checking in your system can now be done in minutes. This includes instant on-screen feedback if a number is on the TPS or CTPS, a flag saved against the record indicating whether the record is callable (for use in things like Advanced Finds), and an overnight cleanse to ensure your data stays compliant moving forward.

If you would like to try the TPS solution in your Dynamics 365 for CE system, please get in touch.

The latest blog from Ryan on his learning journey. The aim of this post is to show my journey step by step, taking my prior creation of a C# MVC web application with an email validation check, and using that knowledge to create a front-end (JavaScript and HTML) web application with an email validation check. Again, front-end development was something I had very little experience with before this project.

This project was my second task at Data-8, and I completed it within my second week of starting at the company.

My first step to implementing a web service through the front-end was to create an ASP.NET Web Application project in Visual Studio.

The only page I would need for front-end implementation would be a view, as this is where both the JavaScript and HTML are written.

In Solution Explorer, I opened the Views folder > Home > Index

I cleared out the existing ‘startup’ HTML and added the following code:
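The original snippet appeared as a screenshot that hasn’t survived; a minimal reconstruction of the form described below might look like this (element IDs, labels and class names are my assumptions):

```html
<!-- Reconstruction of the form described in the post; IDs are assumed -->
<form>
    <label for="bookName">Book Name</label>
    <input type="text" id="bookName" />

    <label for="emailAddress">Email Address</label>
    <input type="text" id="emailAddress" />

    <label for="contactName">Contact Name</label>
    <input type="text" id="contactName" />

    <button type="button" id="checkButton">Check</button>
</form>

<!-- Hidden alert area, revealed later in code with jQuery's .show() -->
<div id="alert" class="alert" style="display:none;"></div>
```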

This created a basic form with input fields to gather a Book Name, Email Address and Contact Name. These would be used later in the validation checks to collect user data to validate. The alert div element is initially hidden (style=”display:none;”) but can be shown later in code by calling .show(); I used this to output the result of the validation checks in a tidy way.

The next thing I wanted to implement was to perform validation checks on the entered data as soon as the user clicked off a field. This instant feedback would be useful as it catches errors in entered data before the user submits the form.

After researching how to trigger an event when the user clicks off a box, I came across the ‘onblur’ event. This is implemented as an attribute on an HTML element and points to a function that should be called when the user clicks off that element.

I created a function called ‘OnFocusChanged’ which took two parameters: an ID and the value of the entered data. This would populate the hidden alert message every time the user clicks off a field in the form without entering any data. It generates a message based on the ID of the element that was active, and outputs it to the hidden alert box. Notice the ‘.show()’ at the end of the jQuery line that reveals the HTML element to the user.

I also added the ‘onblur’ code into the HTML elements and passed in the ‘OnFocusChanged()’ function to be called when the user clicks off the field. I gave the function parameters of a string to identify the HTML element and the data that was entered.

I then built the program, which displayed the three input fields and a submit button. Clicking into a field and then clicking off it should generate an error message for the respective field.

So far, I had created a program that checks the user has entered data before clicking the ‘Submit’ button. However, it doesn’t use any web services to check the validity of the entered data, only that some data exists.

The next step was to implement a simple validation check in the ‘OnSubmitClicked’ function to check that the required data is not empty, before passing it into a web service call. I used jQuery to access each required field’s value and checked whether the value was empty; if so, I output a message to inform the user that they had to enter data in that field, then returned, which stops the function from attempting any validation checks.

Once the data had passed the null/empty checks, it needed to be passed into a web service call to validate the entered data. To call the web service, I created an Ajax request which passed data in a formatted structure to an API, and then received formatted data in return as the result of the check.

For this project, I wanted to use the Email Validation web service by Data-8 to check the validity of the entered email address.

This Ajax request generates a message on success, using a switch statement that checks the result of the validation check.

When attempting to implement the web service, I found I was getting errors from the way in which I was passing data into the call. I researched further into the web service and found several pages of documentation which helped me understand the requirements.

To find out the structure of the data passed into and returned from the web service call, I had to access the web service’s documentation for email validation and digest the XML code to find the structure of the data.

On the ‘Email Validation’ service documentation page, I saw a list of all the email validation checks that I could call. I chose ‘IsValid’ which was the function I would be calling in my project.

I was then presented with several blocks of XML code. I found the data structure for the data being passed into the Ajax request under the name of the function being called.

The data structure for the data returned from the Ajax request was in a second block of XML code, inside an object named after the function followed by ‘Result’ (e.g. ‘IsValidResult’).

In the ‘Level’ and ‘Result’ variables, multiple possible values can be seen, representing an enum. All the possible options are listed, but only one is used at a time, and it needs to be entered exactly as shown, including capitalisation, as a string literal.

Back to the Ajax Request and Web Service Call

Now that I understood the structure of the data I was sending and getting back, I could see where I was going wrong. I formatted the data before passing it as a JSON string (with ‘JSON.stringify()’) into the function. The entered data needed a name as a string, and a value.
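Putting that together, the request body might be built along these lines. The property names and the endpoint path in the comment are illustrative reconstructions, not Data-8’s documented contract; the real names come from the web service documentation:

```javascript
// Build the JSON body for the 'IsValid' email validation call. Property names
// are illustrative; check the service documentation for the exact structure.
function buildIsValidRequest(email, level) {
  return JSON.stringify({
    email: email,
    level: level // one of the documented enum values, exactly as shown, e.g. "Address"
  });
}

var body = buildIsValidRequest("test@example.com", "Address");

// The Ajax call then posts that string, along the lines of:
// $.ajax({
//   url: ".../EmailValidation/IsValid.json", // endpoint path is illustrative
//   type: "POST",
//   contentType: "application/json",
//   data: body,
//   success: function (response) {
//     // response.IsValidResult contains the outcome (Level, Result, ...)
//   }
// });
```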

When I now built the program, entered a value in all fields, and clicked ‘Check’, the alert banner produced a result of the validity of the entered email address.

My program now had basic input validation for any fields marked as required, as well as a validation check that the entered email address was valid. This was another extremely useful project that taught me a lot about front-end development (including jQuery, HTML and Ajax requests), about how Data-8’s web services work, and about how to find out what data a validation check requires.

For more help, just check out the ‘How to’ section on our website. Thank you for reading, and if you have any questions, please don’t hesitate to ask. Ryan.