
In today’s technology-driven world, connectivity is everything, from the edge to the enterprise.

So when Thingstream, a company that builds IoT connectivity platforms to deliver an intelligent network of connected 'things' to customers, needed a solution to help customers get data from their connected devices and make effective business decisions, it turned to TIBCO.

Most of Thingstream’s customers require constant, real-time updates, oftentimes from devices enduring a number of harsh conditions. To meet this growing need, Thingstream is able to provide secure, easy-to-configure, cost-effective solutions by embedding TIBCO Flogo® and TIBCO® Messaging in its devices. “We have a lot of customers who are moving product across national borders or oceans, and they’re tracking not just position but vibration, temperature, and humidity to describe environmental conditions for their customers,” said Bruce Jackson, CTO at Thingstream.

Some customers even require devices to remain in the field for as long as a decade and they need the power to send and receive messages constantly. Flogo’s lightweight model was the perfect fit, allowing Thingstream to strike the ideal balance between processing power and battery life. “You really want something that will compile down into something that’s small and efficient but platform independent at the same time, and that’s really the goal of Flogo,” said Jackson.

With TIBCO’s assistance, Thingstream is able to help its own customers deliver on a very important promise: that products being shipped all over the world are being well-handled and will arrive safe and sound with no surprises. “You can’t sell the ability to move goods from A to B at a certain temperature with a minimum vibration and low humidity if you can’t track and measure those things for the duration of the journey,” Jackson explained.

SQL Server has had many different methods to track changes to data: old-fashioned trigger-based logging, Change Data Capture, and Change Tracking. All of these features let you see how data has changed, with varying limits on how far back you can look. However, none of them could show you how the entire table looked at any given point in time. That is what temporal tables provide: they log every change that happens to the table, and when you query for a specific point in time, SQL Server does the hard work of reconstructing a snapshot of how the data in the entire table looked at that time.
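
Conceptually, system versioning keeps the current rows in one place and closed-off row versions (each valid during a half-open period) in a history log. A minimal stdlib-only Python sketch of that idea (an illustration only, not SQL Server's implementation; all names here are invented for the example):

```python
from datetime import datetime, timedelta

MAX_DATE = datetime(9999, 12, 31)  # stand-in for datetime2's maximum value

class TemporalTable:
    """Toy model of a system-versioned table: current rows plus a history
    of retired row versions, each valid during [start, end)."""

    def __init__(self):
        self.current = {}   # key -> (value, start)
        self.history = []   # list of (key, value, start, end)

    def upsert(self, key, value, now):
        # An update retires the old version into history with end = now.
        if key in self.current:
            old_value, start = self.current[key]
            self.history.append((key, old_value, start, now))
        self.current[key] = (value, now)

    def delete(self, key, now):
        # A delete also retires the row; nothing remains in `current`.
        value, start = self.current.pop(key)
        self.history.append((key, value, start, now))

    def as_of(self, key, when):
        """Return the row's value as it looked at `when`,
        in the spirit of FOR SYSTEM_TIME AS OF."""
        if key in self.current and self.current[key][1] <= when:
            return self.current[key][0]
        for k, value, start, end in self.history:
            if k == key and start <= when < end:
                return value
        return None
```

An update at day 1 retires the day-0 version, so an `as_of` lookup with a day-0 timestamp returns the old value, much as querying a temporal table FOR SYSTEM_TIME AS OF an earlier time does.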

A great introduction on how to set up temporal tables in different ways with various limitations can be found here. In this article, you will learn how to set up versioning when creating a new table and how to convert a table with an existing history to a system-versioned temporal table.

Creating a System-Versioned Temporal Table

Table versioning can be created on entirely new tables, or on existing tables. The history table, or the table where changes are logged, can be:

An entirely new, ‘anonymous’ table with no name specified in which case SQL Server creates a table and assigns a name,

A ‘default’ table with a name as desired by you,

An existing table with data that you now want to use as a history log.

To get started, create an entirely new table and version it first.

CREATE TABLE [dbo].[Region]
(RegionID INT IDENTITY(1,1) NOT NULL
     CONSTRAINT PK_Region PRIMARY KEY CLUSTERED,
 RegionDescription VARCHAR(100) NULL,
 StartDateTime DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
 EndDateTime DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
 PERIOD FOR SYSTEM_TIME (StartDateTime, EndDateTime))
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Region_History));

INSERT INTO [dbo].[Region] (RegionDescription)
VALUES ('North USA')

INSERT INTO [dbo].[Region] (RegionDescription)
VALUES ('South USA'), ('NorthEast USA')

SELECT * FROM [dbo].[Region]

It is easy to see that the column StartDateTime is populated with the current date and time in UTC, and EndDateTime is the maximum value a datetime2 data type can hold. These columns are not specified in the insert statements and have no defaults defined; they are autopopulated. Notice the syntax in the CREATE TABLE statement: GENERATED ALWAYS AS ROW START and GENERATED ALWAYS AS ROW END.

Now take a look at what was logged in the history table:

SELECT * FROM [dbo].[Region_History]

This returns nothing: the history table logs only updates and deletes, not inserts.

Now, if you run an update and then look at the history table, you will see that the previous row has been logged. The StartDateTime and EndDateTime values specify exactly when this row was active.

UPDATE [dbo].[Region] SET RegionDescription = 'NorthEast US'
WHERE RegionDescription = 'NorthEast USA'

SELECT * FROM [dbo].[Region_History]

If you look at the main table for the same row, you can see that it has a new start date that matches the date when the previous version of the row was retired.

SELECT * FROM [dbo].[Region]

Deleting a row works similarly: the end date of the deleted row matches the date when it was deleted, and there is no longer a matching row in the main table.

DELETE FROM [dbo].[Region] WHERE RegionID = 3

SELECT * FROM [dbo].[Region_History] WHERE RegionID = 3

Adding Versioning to an Existing Table

The next scenario is to transition a table with an existing, trigger-based history to system versioning. Trigger-based change tracking is still a very common and easy-to-implement process used in many places. This example explores a simple implementation of it and shows how to reuse the same history table, without changing or deleting any data, to implement versioning.

IF (EXISTS (SELECT *
            FROM INFORMATION_SCHEMA.TABLES
            WHERE TABLE_SCHEMA = 'dbo'
            AND TABLE_NAME = 'Region'))
BEGIN
    ALTER TABLE dbo.Region SET (SYSTEM_VERSIONING = OFF)
    DROP TABLE dbo.Region
    DROP TABLE dbo.Region_History
END
GO

CREATE TABLE [dbo].[Region](
    [RegionID] [int] IDENTITY(1,1) NOT NULL,
    [RegionDescription] [varchar](100) NULL,
    [CreateUser] [nvarchar](100) NOT NULL
        DEFAULT (COALESCE(SUSER_NAME(SUSER_ID()), USER_NAME())),
    [CreateDate] DATETIME NOT NULL DEFAULT GETDATE(),
    [UpdateUser] [nvarchar](100) NOT NULL
        DEFAULT (COALESCE(SUSER_NAME(SUSER_ID()), USER_NAME())),
    [UpdateDate] DATETIME NOT NULL DEFAULT GETDATE()
    CONSTRAINT [PK_Region] PRIMARY KEY CLUSTERED
    (
        [RegionID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF,
            IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON,
            ALLOW_PAGE_LOCKS = ON))

CREATE TABLE [dbo].[Region_History](
    [RegionHistoryID] [int] IDENTITY(1,1) NOT NULL,
    [RegionID] [int] NOT NULL,
    [RegionDescription] [varchar](100) NULL,
    [CreateUser] [nvarchar](100) NOT NULL,
    [CreateDate] DATETIME NOT NULL,
    [UpdateUser] [nvarchar](100) NOT NULL
        DEFAULT (COALESCE(SUSER_NAME(SUSER_ID()), USER_NAME())),
    [UpdateDate] DATETIME NOT NULL DEFAULT GETDATE()
)
GO

There are two simple triggers, one for updates and one for deletes, to track changes to the table.

CREATE TRIGGER [dbo].[Region_Update] ON [dbo].[Region]
AFTER UPDATE
AS
BEGIN
    INSERT INTO dbo.Region_History
        (RegionID, RegionDescription, CreateUser, CreateDate,
         UpdateUser, UpdateDate)
    SELECT i.RegionID, i.RegionDescription, i.CreateUser, i.CreateDate,
           SUSER_SNAME(), GETDATE()
    FROM dbo.Region r
    INNER JOIN inserted i ON r.RegionID = i.RegionID
END
GO

CREATE TRIGGER [dbo].[Region_Delete]
ON [dbo].[Region]
AFTER DELETE
AS
    INSERT INTO [dbo].[Region_History]
        ([RegionID], [RegionDescription], [CreateUser],
         [CreateDate], UpdateUser, UpdateDate)
    SELECT [RegionID], [RegionDescription], [CreateUser], [CreateDate],
           SUSER_SNAME(), GETDATE()
    FROM deleted
GO

-- Now insert data into the main table.
INSERT INTO [dbo].[Region] (RegionDescription)
VALUES ('Northeast'), ('Southwest'), ('West'), ('Southeast'), ('Midwest');

SELECT * FROM [dbo].[Region]

Intentionally change the same records several times so that the history table accumulates a decent volume of data. This script takes about 10 minutes to run, because it recreates a history of many updates with different timestamps.

DECLARE @counter INT
SELECT @counter = 100
WHILE @counter > 0
BEGIN
    UPDATE [dbo].[Region] SET RegionDescription = 'NorthEast'
    WHERE RegionDescription = 'Northeast'
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'Southwest '
    WHERE RegionDescription = 'Southwest'
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'Southeast '
    WHERE RegionDescription = 'Southeast'
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'Midwest '
    WHERE RegionDescription = 'Midwest'
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'MidWest'
    WHERE RegionDescription = 'Midwest '
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'SouthWest'
    WHERE RegionDescription = 'Southwest '
    WAITFOR DELAY '00:00:01'
    UPDATE [dbo].[Region] SET RegionDescription = 'SouthEast'
    WHERE RegionDescription = 'Southeast '
    SELECT @counter = @counter - 1
END

Also, delete a couple of records from the main table.

DELETE FROM [dbo].[Region] WHERE RegionDescription = 'West'
DELETE FROM [dbo].[Region] WHERE RegionDescription = 'MidWest'

SELECT * FROM dbo.Region

You’ll see 702 rows in the history table.

SELECT * FROM dbo.Region_History

The goal is to transition these two tables to temporal tables by keeping this data intact and allowing for traditional querying as well as querying using temporal table methodology.

As a first step, add start and end dates to both tables:

ALTER TABLE dbo.Region
ADD [StartDate] [datetime2] NOT NULL DEFAULT (GETDATE()),
    [EndDate] [datetime2] NOT NULL
        DEFAULT (CONVERT(datetime2, '9999-12-31 23:59:59.9999999'))

ALTER TABLE dbo.Region_History
ADD [StartDate] [datetime2] NOT NULL DEFAULT (GETDATE()),
    [EndDate] [datetime2] NOT NULL
        DEFAULT (CONVERT(datetime2, '9999-12-31 23:59:59.9999999'))

The structures of the history table and the main table must be identical to turn versioning on. Since there is one column, RegionHistoryID, in the history table that is not in the main table, you can either drop it from the history table or add it to the main table. Dropping it would leave the history table without a key for the older querying method, which is not ideal if you still want to query older data that way. Instead, add it to the main table; you won't use it, it's only there to make the schemas conform.

ALTER TABLE [dbo].[Region] ADD RegionHistoryId int;

The next step is to add the period connecting the two new fields in the main table, and then attempt to enable versioning.

ALTER TABLE dbo.Region
ADD PERIOD FOR SYSTEM_TIME ([StartDate], [EndDate])

ALTER TABLE dbo.Region
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Region_History,
     DATA_CONSISTENCY_CHECK = ON))

This attempt fails: SQL Server will not allow identity columns in a history table. The identity property must be removed, but the data in the column is needed. To solve this, create another column, move the data there, drop the identity column, and rename the new column to the old name.

ALTER TABLE [dbo].[Region_History] ADD RegionHistId int;
GO
UPDATE [dbo].[Region_History] SET RegionHistId = RegionHistoryID;
GO
ALTER TABLE [dbo].[Region_History] DROP COLUMN RegionHistoryID;
GO
EXEC sp_rename 'dbo.Region_History.RegionHistId',
     'RegionHistoryID', 'COLUMN';
GO

Now that the identity column is removed from the history table, try to turn versioning on again. This time you’ll get another error.

ALTER TABLE dbo.Region
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Region_History,
     DATA_CONSISTENCY_CHECK = ON))

The data consistency check runs DBCC CHECKCONSTRAINTS under the hood and reports issues if the temporal constraints do not validate. Here, the default value of the new EndDate column in the history table is the system's maximum date, which, of course, is in the future, while history rows must end in the past.

There are several ways to resolve this problem. One way is to enable versioning but to skip the checks. Don’t run this code, but here it is for your reference:

ALTER TABLE dbo.Region
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Region_History,
     DATA_CONSISTENCY_CHECK = OFF))

This essentially means the data currently in the history table cannot be queried with temporal table methods. You may also run into issues querying new data, because the older data is bad and has time-period overlaps. This approach is not recommended since it carries a lot of risk.

Instead, it is better to fix the data so the time periods match what temporal table methodology expects. Each history record must have a start and end date in the past during which the row was valid. The start date of each history record must equal the end date of the one before it, its end date must be the start date of the next one, and so on. The start date of the main-table record should equal the last end date of its history. Cleaning up the data this way ensures there are no time gaps.
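
The chaining rule can be stated compactly outside SQL. Here is a hedged, stdlib-only Python sketch of the same computation (the function name and shapes are invented for illustration; SQL Server does none of this for you, it only verifies the result):

```python
def chain_periods(update_dates, create_date, max_date):
    """Turn one row's trigger-based history (its ordered update timestamps)
    into gap-free [start, end) periods, temporal-table style.

    The first period starts at the row's create date; each later period
    starts where the previous one ended; the live row in the main table
    is valid from the last update until max_date."""
    periods = []
    prev_end = create_date
    for update_date in update_dates:
        periods.append((prev_end, update_date))
        prev_end = update_date
    current_period = (prev_end, max_date)   # period for the main-table row
    return periods, current_period
```

For a row created at time 5 and updated at times 10, 20, and 30, this yields history periods (5, 10), (10, 20), (20, 30) and a current period of (30, max), which is exactly what the three update steps below compute in SQL.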

To perform the cleanup, follow these three steps:

Step 1: Find the first history record for each row in the main table and set start date to equal the create date and end date to equal update date.

UPDATE dbo.Region_History SET StartDate = CreateDate,
       EndDate = UpdateDate
-- select a.RegionID, a.RegionHistoryID, b.slno
FROM dbo.Region_History a INNER JOIN
     (SELECT RegionID, RegionHistoryID,
             RANK() OVER (PARTITION BY RegionID ORDER BY RegionHistoryID) AS slno
      FROM dbo.Region_History) b
ON a.RegionID = b.RegionID AND a.RegionHistoryID = b.RegionHistoryID
AND b.slno = 1

Step 2: Find the records that come after the first one and update them in sequence; the start date of each record should equal the end date of the previous one.

UPDATE dbo.Region_History SET StartDate = b.priorupdatedate,
       EndDate = a.UpdateDate
-- select a.*, b.priorupdatedate, b.slno
FROM dbo.Region_History a INNER JOIN
     (SELECT RegionID, RegionHistoryID, UpdateDate,
             LAG(UpdateDate) OVER (PARTITION BY RegionID ORDER BY UpdateDate)
                 AS priorupdatedate,
             RANK() OVER (PARTITION BY RegionID ORDER BY RegionHistoryID) AS slno
      FROM dbo.Region_History) b
ON a.RegionID = b.RegionID AND a.RegionHistoryID = b.RegionHistoryID
AND b.slno > 1 AND b.priorupdatedate IS NOT NULL

Step 3: The end date of the very last history record should equal the start date of the same record in the main table. Remember that the old triggers are still enabled, so any changes you make to the main table will be logged again; you have to drop those triggers first. You also have to temporarily remove the period.

DROP TRIGGER [dbo].[Region_Delete]
DROP TRIGGER [dbo].[Region_Update]

ALTER TABLE dbo.Region DROP PERIOD FOR SYSTEM_TIME;

Then, run an update to bridge the history on the history table and main table.

WITH RegionCTE AS
(
    SELECT RegionID, maxupdatedate = MAX(UpdateDate)
    FROM dbo.Region_History GROUP BY RegionID
)
UPDATE dbo.Region SET StartDate = b.maxupdatedate,
       EndDate = '9999-12-31 23:59:59.9999999'
-- select a.*, b.maxupdatedate
FROM dbo.Region a INNER JOIN RegionCTE b
ON a.RegionID = b.RegionID

You may, if you choose, drop the columns CreateUser, CreateDate, UpdateDate, UpdateUser, and RegionHistoryID from both tables at this point. If older queries still use these columns, though, this might not be desirable.

Now, when you add the period back in and set versioning on, it works like a charm. You have also covered all time gaps involved so that querying using both the older method and the new method in versioning will work the same.

ALTER TABLE dbo.Region
ADD PERIOD FOR SYSTEM_TIME ([StartDate], [EndDate])

ALTER TABLE dbo.Region
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Region_History,
     DATA_CONSISTENCY_CHECK = ON))

SELECT * FROM dbo.Region
FOR SYSTEM_TIME AS OF '2019-03-10 14:07:29.2366667';

This returns the snapshot of the table as of that time. You can also compare the data between two points in time:

DECLARE @ADayAgo datetime2
SET @ADayAgo = DATEADD(day, -2, GETDATE())

/* Comparison between two points in time for a subset of rows */
SELECT D_1_Ago.[RegionID], D.[RegionID],
       D_1_Ago.[RegionDescription], D.RegionDescription,
       D_1_Ago.[StartDate], D.[StartDate],
       D_1_Ago.[EndDate], D.[EndDate]
FROM [dbo].[Region] FOR SYSTEM_TIME AS OF @ADayAgo AS D_1_Ago
JOIN [dbo].[Region] AS D ON D_1_Ago.[RegionID] = [D].[RegionID]
AND D_1_Ago.[RegionID] BETWEEN 1 AND 4;

(The results returned depend on when you run this query relative to when the data was created, so use an appropriate date for the variable @ADayAgo.)

Converting Your Data

Cleaning up data to conform to system-versioned temporal tables can be quite tricky, and your scenario may be even more complex. Here are a few things to keep in mind:

The start date should always be less than the end date in both tables.

If you have multiple history records for a single parent record in the main table, their start and end dates must be sequential, in ascending order, with no period overlaps.

The end date for the last row in the history table should match the start date for the active record in the parent table.
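
These three rules amount to a simple invariant you can check before attempting to enable versioning. A hypothetical Python sketch of such a check (illustrative only; SQL Server performs the real check via DATA_CONSISTENCY_CHECK):

```python
def validate_history(history_periods, main_period):
    """Check the invariants system versioning expects: start < end for
    every period, consecutive history periods chain with no gap or
    overlap, and the last history end equals the main row's start."""
    all_periods = history_periods + [main_period]
    # Rule 1: start date always less than end date, in both tables.
    if any(start >= end for start, end in all_periods):
        return False
    # Rules 2 and 3: each period ends exactly where the next begins,
    # including the hand-off from the last history row to the main row.
    return all(end_a == start_b
               for (_, end_a), (start_b, _) in zip(all_periods, all_periods[1:]))
```

A gap, an overlap, or an inverted period all cause the check to fail, which mirrors the consistency errors seen earlier when enabling versioning.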

Deleting data that does not obey these conditions is also a possible solution, but since that defeats the purpose of converting an existing table to history, it is not recommended. Instead, you could keep that table as-is and use a brand-new table to store versioning history.

It is also noteworthy that table versioning does not capture who made the change. That is something you may have to implement yourself if you need this information; the trigger-based workaround suggested by MVP Aaron Bertrand is a good way to incorporate it.

Removing Versioning

Eventually you may have a scenario where you need to drop the tables or remove versioning entirely, perhaps because the table is gathering too much history or the storage footprint is no longer affordable.

To do this, you need to set system versioning off and drop the period for SYSTEM_TIME. You can remove the date columns too, since they are not of much relevance once the table no longer uses them, but this is optional.

There are a few steps to go through for this process and the following script can come in handy.

DECLARE @DefaultConstraint nvarchar(200)
DECLARE @Tablename nvarchar(200)
DECLARE @startdatecolumnname nvarchar(200)
DECLARE @enddatecolumnname nvarchar(200)
SELECT @Tablename = 'dbo.Region'
-- use the period column names from this article's example
SELECT @startdatecolumnname = 'StartDate'
SELECT @enddatecolumnname = 'EndDate'
EXEC('ALTER TABLE ' + @Tablename + ' SET (SYSTEM_VERSIONING = OFF)')
EXEC('ALTER TABLE ' + @Tablename + ' DROP PERIOD FOR SYSTEM_TIME;')
SELECT @DefaultConstraint = name FROM sys.default_constraints
WHERE parent_object_id = OBJECT_ID(@Tablename)
AND parent_column_id = (SELECT column_id FROM sys.columns
                        WHERE name = @startdatecolumnname
                        AND object_id = OBJECT_ID(@Tablename))
IF @DefaultConstraint IS NOT NULL
    EXEC('ALTER TABLE ' + @Tablename +
         ' DROP CONSTRAINT ' + @DefaultConstraint)
EXEC('ALTER TABLE ' + @Tablename +
     ' DROP COLUMN IF EXISTS ' + @startdatecolumnname)
SELECT @DefaultConstraint = name FROM sys.default_constraints
WHERE parent_object_id = OBJECT_ID(@Tablename)
AND parent_column_id = (SELECT column_id FROM sys.columns
                        WHERE name = @enddatecolumnname
                        AND object_id = OBJECT_ID(@Tablename))
IF @DefaultConstraint IS NOT NULL
    EXEC('ALTER TABLE ' + @Tablename +
         ' DROP CONSTRAINT ' + @DefaultConstraint)
EXEC('ALTER TABLE ' + @Tablename +
     ' DROP COLUMN IF EXISTS ' + @enddatecolumnname)

Summary

Data versioning is complex, and no two versioning methods work exactly the same way. There are many situations you may run into when transitioning from an older method to temporal tables, and knowing what SQL Server expects helps the transition happen smoothly. Temporal tables are a great feature and very easy to use once you cross the hurdle of setting them up.

Machine learning algorithms are crafty sons of guns. They’ve not only composed memorable (if not entirely coherent) holiday tunes and written lyrics, but produced artwork that’s convincingly humanlike. Now, Janelle Shane, an artificial intelligence (AI) research scientist and author of the blog AI Weirdness, has tapped them to undertake a task of greater import: naming fireworks.

In a blog post this week spotted by Ars Technica — the most recent in a series, following posts about AI systems that have learned to name cookies and Pokémon — Shane detailed a neural network trained on thousands of names from vuurwerkbieb.nl. Said names — 3,000 in all — were collated in a database she received from a reader in the Netherlands.

Shane chose textgenrnn, an open source module built on top of Google's TensorFlow machine learning framework and the high-level neural networks API Keras. After feeding it the aforementioned corpus, she got it to generate names ranging from the plausible to the laughably absurd.

As Shane explains: “Without any direction from me, and without any prior knowledge of the words and what they mean (or even that they ARE words; they might be knitting patterns), the neural network … produces its best imitation.”

Here’s a partial list:

Happiness Rockets

Gold Mini wow

Flaming Thundersplont Box

Crackling Shapes

Bright Blue Stoppers

Soundd Box

Red flashing cake

Machine Blinking Display

Code Rumble

Smile Rockets

Extreme maxied whistling

None Star Thunder

Excellent Rain

What Crackling 16

Silver Pirate Counters

Pronk XL

B2B Torch All-Dang

Shark Whistler

1 Darner Box

Far Exploring Palm

Black Moo

Sheeperstrike

Senior Fountain

Just Timmy

Hurricane Said Bang

Shaming Bulls

Big Doo Cake

Spent on the Bun

Stragfalmed

Medium mushing box

Green Event Shorts

I just looked

Definite Box

Into Eggs

Oh Fountain

Event Badger One Assortiment

Sneaking Veet Box 3

Black Hall is released

Golden Strink is 4x

Original Cat Pix Budget 2 Boom

“In this case, its best imitation of fireworks sounds a lot like, well, an artificial intelligence trying to imitate fireworks names,” she wrote. “Some of its fireworks sound more like they’re from the post-human era where machines dimly remember what humans were like, and celebrate them with commemorative explosions.”

Shane’s reserved the full list of AI-generated firework names for her email list, which you’ll find here. She warns that some of them might be considered “PG-13.”

The holidays are here, and we’re in the homestretch before everyone turns their full attention on 2019. You’ve probably been pushing holiday promotions and seasonal offerings in some form for every stage of the customer journey, but these marketing campaigns are coming to a close and it’s time to think forward. Before jumping right into 2019 nurture programs and messaging, let your business press pause on the promotions, offers, and value-packed opportunities to simply recognize and appreciate this holiday season. It’s a way of staying top-of-mind with your customers while smoothly transitioning from holiday-centric content to gearing up for an exciting new year.

Wrap up and put a bow on your holiday messaging this next week with a non-promotional email to your customers. It's a way of letting them know you're thinking of them during the holidays, and that you recognize and appreciate the opportunity to celebrate this special time of year together, even if it's just in the form of an email.

7 Non-Promotional Holiday Email Ideas

1. A sincere letter from the executive team or CEO. Your business wouldn’t be where it is today without your customers and they’ll appreciate hearing it on a personal level from the top.

2. If they currently have products with your business that are great to use during this hectic time of year, it’s worth a friendly reminder! A financial institution, for instance, may want to let its cardholders know that this month’s cash-back rewards are in categories that pertain to their needs around the holidays, like groceries (a reminder just before Christmas dinner shopping) or department stores (for last-minute gifts). It’s not promotional if they already have the product, and serves as a reminder of how it can best benefit your customers this time of year.

3. A message on how to stay safe during this holiday season, especially as it relates to your industry. This can be in the form of a list, infographic, or even a short video. Insurance firms are a great example, sharing safety tips around the home for large gatherings and inclement weather. Not in an industry with many safety hazards this time of year? Give it a cheeky spin with tips on how to unplug from our devices, and even from work, to enjoy the holiday.

4. Share how your business gave back to its community this holiday season. Perhaps your local offices, branches, or agencies hosted a food drive, volunteered as a team, made a charitable donation, or contributed to families in need – your customers want to know that they’re supporting businesses that give back to their communities. Briefly recap with a few photos why you chose this type of charitable work, the experience, and the outcome. Be humble and appreciative to your customers for helping make this happen; help them feel part of how the business gave back.

5. Reflect on the past year for the business, community, and customers. This is a great opportunity to share your 2018 successes and, what’s more, extend those successes to your customers. Keep the reflections somewhat high-level and light; now’s not the time to crunch numbers and create charts. What do your customers care to hear about, and how did they help you achieve it? What accomplishments instill trust in your brand?

6. Shine light on a customer who has made an impact on their community. It’s not about how successful they’ve been since implementing your product or service; this is all about celebrating their positive contribution and sharing the holiday spirit. (Remember to make sure they’re okay with you sharing!)

7. Create a list of top content in your industry from this past year. Is there a particularly great thought leadership article, value-packed report, or several books that your professional network will love? Again, we’re keeping this light and non-promotional as we wrap up the holiday season, but this informational round-up serves as a good transition from 2018 to 2019.

At the end of the day, never underestimate the power of a personal note that expresses appreciation. It’s the gesture more than anything, and a reminder in their inbox that you’re thinking of them (and not their wallets).

How Toad&Co Grew into a Leader in Sustainable Clothing without Growing Resources

Posted by Kendall Fisher, Executive Producer and Host of The NetSuite Podcast

Toad&Co is an outdoor clothing and lifestyle brand that’s set out to make a difference in the world by helping the planet and helping people.

The apparel industry is the fourth largest polluter of air and water in the world, and Toad&Co has become an industry leader in changing that. Since its genesis in 1995, the company has been focused on minimizing its impact on the planet by utilizing sustainable, eco-friendly materials through all avenues of business, from production to manufacturing to shipping. In fact, by Spring 2019, Toad&Co will release its very first line made 100% from eco-friendly materials.

But Toad&Co’s focus on social impact doesn’t stop there.

On average, only 35% of people with disabilities in the United States are employed—a number that drops to 9% for adults with intellectual and developmental disabilities. So, in 1997, Toad&Co set out to help close that gap. The company teamed up with Search, Inc. to form Planet Access Company, a logistics warehouse that provides career opportunities for adults with developmental disabilities.

[embedded content]

Typically, a company with so many important branches to its business would seek multiple systems and teams to make it all work. Even Toad&Co’s VP of Sales and Retail, Scott Whipp, admits the company’s business managers could have easily “tripled time and effort” just to stay afloat.

But they didn’t—thanks to the right processes and systems.

Watch the video above to find out how NetSuite is helping Toad&Co boost efficiencies, and thereby, allowing the company to focus on its important mission.

From technology innovation to the workplace, the business landscape has been evolving rapidly, and companies now are tasked with adapting to fast change in a world of ongoing digital transformation.

However, there is one element that will remain a constant requirement for success: meeting the needs of customers and delivering a quality experience.

Nurturing customer relationships always has and always will be fundamental for company success. However, as the world becomes increasingly connected and more communications become instant, it arguably is more vital now than ever before.

Platforms such as social media and review sites offer customers the ability to voice their experiences to a global audience anywhere and at any time.

The Power of the Customer

While these global online channels pose huge advantages for brands receiving positive messages from customers about their service, they can prove hugely detrimental for those recognized for delivering a poor experience.

With messages now able to reach thousands of people around the world within moments, consumers can be some of the most influential brand ambassadors. Accounts of their experiences can reach a wealth of potential new customers and often can influence the decision-making process of others.

Around half of shoppers would try a new brand based on reviews (57 percent) or word of mouth (46 percent) alone, recent research suggests.

What’s more, consumer loyalty inevitably has been depreciating at the hands of technology and greater demand for fast and convenient exchanges. All too often, we see brands — particularly across the retail sector — face significant backlash from customers unhappy with the level of service and communication provided.

Companies that neglect the development of customer relationships at the very early stages risk losing customers before connections can be fully established.

Taking Action on Customer Insights

With competition growing in almost every industry worldwide, businesses need to take action to understand the standpoint of their customers better, and to use the insights gathered from those relationships to adapt and enhance the delivery of their future services.

Those on the receiving end ultimately are very well placed to expose common pain points and weaknesses in a product or service, offering businesses deeper insights into trends influencing the behavior and needs of those they ultimately want to reach.

Enabling greater interactions between staff and consumers can be extremely valuable in finding new ways to improve a business. However, decision makers must ensure they have the tools in place to give customers a voice, and then use them to evaluate their business and drive it forward.

Good reporting on customer relationships is therefore pivotal to a company’s performance strategy. As technology progresses, so should a company’s ability to drive success and efficiency through better customer insights.

Many successful businesses no longer view customer relationship management software as a way to gather customer information. They now see it as a productivity tool — a way to connect with customers intelligently — and, more crucially, a means to achieving better customer service success.

The value that these systems provide to businesses around the world is evident. CRM became the largest software market in 2017, according to Gartner, and it is tipped to see a further 16 percent growth by the end of this year.

The insights captured and stored within these tools are key to building a 360-degree view of customers in an ever growing pool of potential targets. However, the onboarding process of software-based solutions can prove challenging, particularly when it comes to traditional sales reps who often view these systems as tedious and unnecessary.

However, even the most diehard “pen and paper” sales people can be converted with the right piece of technology — one that helps them simplify their everyday tasks and improve performance.

By providing employees with the solutions needed to establish and foster customer relationships in a fast-paced environment, decision makers can ensure that they lead from the front to help turn customer insights into intelligence, and to enhance their service delivery and kickstart business growth.

Overcoming the Hurdles

Although it is clear that a greater investment in listening and learning from customers is important for success in the digital age, there are still some challenges that can arise when companies attempt to harness and apply this intelligence most effectively.

For example, with companies able to reach customers across a range of platforms and touchpoints, the pool of data that can be collected and analyzed has been growing continuously, and it takes many different forms.

Once staff are instructed to use customer experiences to inform growth strategies, companies can end up dedicating huge amounts of time and resources to analyzing this data, narrowing the window in which to develop an actionable strategy for improvement.

Using solutions to make this process as efficient as possible will ensure that businesses can maximize the opportunity to turn customer experiences into new opportunities for growth and improvement.

Sophisticated analytics tools, now widely available, can track and review various metrics of customer engagement, such as rate, duration, and sentiment across a range of digital channels and CRM systems, offering companies a means of automating the entire process.

Companies can gather and measure their interactions with current and potential customers quickly and effectively, feeding them back into their business to fuel their sales approach.

As technology continues to evolve, and smarter ways to analyze data arise, companies should aim to digitize the management of customer relationships and open their businesses up to the power they can realize from improved services.

By treating these relationships as valuable business assets, decision makers not only can sharpen the focus of employees, enabling them to make better customer connections, but also can build an effective course of business intelligence to enhance performance in the near future and beyond.

The oft-repeated phrase "CFOs must become more strategic" that has permeated finance stories for what seems like decades may now be becoming a reality.

And it’s not just the position of the CFO itself. Throughout the finance department, people are taking a more strategic role in the business.

One clear signal of the shift is the reporting hierarchy in today’s businesses. According to McKinsey, on average, five functions other than finance now report to the CFO. These include risk, regulatory compliance and M&A functions, as well as IT and digitalisation. And almost four in ten CFOs said they spent the majority of their time over the past year on strategic leadership, organisational transformation and performance management.

It’s not just regulatory pressures forcing CFOs into a strategic role, either. Technologies such as automation, AI, and cloud-based planning and reporting tools are giving the finance team the opportunity to take on decision-making. Real-time reporting and financial close, two historically manual and cumbersome processes, are now becoming easier, faster, and increasingly automated. And these tools, which once may have been under the purview of IT, are now controlled and managed by the finance department, leading to a more strategic leadership role.

Of course, accountants have been working with business units to provide support and analysis to improve operational and management performance for decades. But now the business case for finance to assume even more of a commercial and strategic role is being backed by research. According to a PwC study, organisations that spend considerably more time on business partnering (23 percent) outperform their counterparts. And that applies across all industries, maturity levels, and sizes of organisation.

The study also found a link between the best performing financial teams and those open to new and changing technological developments, such as the use of smart data analysis tools.

Bringing insight and analysis from the CFO and finance team is particularly critical for startups and high-growth organisations, where a clear understanding of the financial implications of strategic and operational decisions is vital to the success and continued expansion of the company. With a clear understanding of finances and the potential impact of decisions, finance can act as a ‘level head’ in what is often a highly dynamic and stressful environment.

For example, with traditional finance tasks such as planning, budgeting and forecasting, real-time data and cloud-based reporting tools can help organisations stay agile and embrace a continuous forecasting model that can react to volatile and fast-changing market conditions. In fact, a KPMG and ACCA survey found that more than three-quarters (77 percent) of executives believe the planning, budgeting and forecasting process must be a partnership-based approach, driven jointly by the business and finance, that takes into account enterprise-wide risks. Indeed, budgeting was once a tick-box exercise, often done just once a year. As it has become more strategic and ongoing, other departments are forced to work more closely with finance, creating further opportunities for the finance department to provide strategic value. And, as the cloud has given even small companies access to sophisticated planning and budgeting tools, more companies are seeing the value.

It’s an approach BUPA UK, a health insurance provider, is taking.

“I believe finance is a commercial function as well as transactional. Finance has the key to good data and information and as such has a responsibility to provide insight, direction and options,” General Manager Wayne Close said in an interview with Financial Director. “A strong, commercially-led finance team is an incredible asset to a business and its CEO. Some of the important characteristics we seek are a high quality of data and the ability of the team to analyse that data to gain powerful insights.”

This evolution to become a true business partner also has implications for the future skills of the CFO and the finance department. As traditional accounting tasks are automated, the CFO and finance department will need to leverage new skills. There will be greater emphasis on data analysis and performance management, and it will be crucial for the finance team to develop strong communication and commercial skills.

The ICAEW says that as finance becomes a business partner, organisations need to guard against the possibility that finance will get too close to the business and lose its objectivity. Finance needs to be actively involved in developing systems and processes across the organisation.

“They are independent from operations and they are the only ones, apart from the CEO, who have a comprehensive vision of the company. The role of a CFO who goes beyond being a bean counter is clearly not only to be a business partner but also to be a business challenger. This is not the easiest part of the job, but it is definitely a part of the modern CFO role.”

As technology automates many of the time-consuming processes in the finance department like verifying data, the finance team can provide deeper advisory services throughout the rest of the business.

Yet, for these relationships to function properly, there needs to be a strong foundation of reliable, timely and appropriate data. In the past, legacy systems with complex integrations and the need to reconcile data in spreadsheets have held these initiatives back. Modern, cloud-based finance and ERP systems now provide the proper foundation.

I can’t work out how to convert a Chinese character into a series of lines (I realise not all strokes in a character are straight, but each could be approximated by one). I’ve tried rasterizing the character and using ImageLines to pick out the strokes, but couldn’t get it to work.

Once I have the data points for the endpoints of each line in the character, I plan to get Mathematica to search through linear transformations to find a suitable transform between the endpoints of the lines and the coordinates of the cities, minimizing the error.
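For the fitting step, assuming the endpoint correspondences are already known, a least-squares affine fit is one option. Here is a rough sketch in Python/NumPy rather than Mathematica, purely to illustrate the idea:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map such that dst ~= src @ A.T + b.

    src, dst: (N, 2) arrays of corresponding points
    (stroke endpoints -> city coordinates).
    Returns (A, b, rms) where rms is the root-mean-square error.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    # Augment src with a column of ones so the translation b is
    # solved for together with the linear part A.
    ones = np.ones((len(src), 1))
    M, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    A, b = M[:2].T, M[2]
    pred = src @ A.T + b
    rms = np.sqrt(((pred - dst) ** 2).mean())
    return A, b, rms
```

This solves for the transform in closed form instead of searching; a search would only be needed if the correspondence between endpoints and cities is itself unknown.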

So my question is: how can I convert a Chinese character into a list of lines?
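To illustrate the “one straight line per stroke” approximation (in Python/NumPy rather than Mathematica, and assuming the strokes have already been separated into individual binary masks, e.g. with MorphologicalComponents), each stroke’s endpoints can be estimated by projecting its pixels onto the stroke’s principal axis:

```python
import numpy as np

def stroke_to_segment(mask):
    """Approximate one connected stroke (2-D boolean mask) by a single
    line segment, returning its two endpoints as (row, col) pairs.

    The principal direction of the stroke's pixel cloud is found via
    SVD; the pixels' extreme projections onto that axis give the
    endpoints.  Curved strokes are deliberately flattened into one
    straight line, as the question proposes.
    """
    pts = np.argwhere(mask)              # (N, 2) stroke pixel coords
    centre = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centre)
    direction = vt[0]                    # principal axis (unit vector)
    t = (pts - centre) @ direction       # scalar projection of each pixel
    p0 = centre + t.min() * direction
    p1 = centre + t.max() * direction
    return p0, p1
```

Running this over every connected component of the rasterized character would yield the list of endpoint pairs needed for the transform search; strokes that touch or cross would first need to be split, which this sketch does not handle.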