Wednesday, December 28, 2016

When I first started consulting over 15 years ago, it seemed like we didn't come across that much customization. Maybe it was because we didn't have a developer on staff, maybe it was the nature of the clients we served, or maybe it was just indicative of the time. But over the past 15 years, I've seen a growth in customization and integration on even the smallest of projects. I attribute this to a number of things, including the growing sophistication (and therefore expectations) of clients (even on the smaller end), the release of additional development tools that decrease the effort involved, and even a changing mindset that customization can help "unleash" the potential of your software. Whatever the reason, it seems like customization at some level has become a norm of sorts.

Let me also add that when I say "customization" in this post, I am including integrations and interfaces between systems as well.

With clients we have implemented, and those we have picked up over time, the tendency seems to be to "trust the professionals" with the customizations. While I agree with this in terms of who should be doing the actual development, I would also emphasize that every client/power user needs to understand their customizations on several levels. This due diligence on the client/power user side can help ensure that the customization...

works in a practical sense for your everyday business

is built on technology that you understand at a high level

can grow with your business

is understood in a way that can be communicated throughout the user base (and for future users)

I often find that over time, the understanding of a customization can become lost in an organization. Give it 5 years, and current users will bemoan that they don't understand...

why they have customization

what the customization does

how the customization can be adjusted for new needs

Meanwhile, the IT admins will bemoan a lack of understanding of how to support the customization effectively, and/or will perpetuate misunderstandings regarding the technology and its capabilities.

None of this is anyone's fault necessarily, but it does emphasize the need for due diligence any time you engage a consultant or developer for a customization (and it is even a worthwhile endeavor to review your existing customizations). What are the key parts of the due diligence I would recommend? Well, you KNEW I was going to get to that! So here you go...

Every customization you have should have a specification. It doesn't have to be fancy, but it does need to be a document that contains an explanation of the functionality of the customization as well as the technology to be used to develop it. Ideally, it should also contain contact information for the developer. I'll be honest, I run into clients wanting to skip this step more often than consultants and developers. I suppose this has to do with not seeing the value of this step, seeing it as a way for consultants to bill more. But this step has the greatest impact on a client's ability to understand what they are paying for, and on minimizing miscommunication and missed expectations. If you don't have specifications for your existing customizations, ask for them to be written now (either internally or by the consultants/developers). On another side note, the actual process for using the customization should be documented somewhere as well. Sometimes this is added to the spec, sometimes it is added to other process documentation. But make sure it happens, so that the customization is included in internal training efforts.

Understand at a high level what technology is being used to develop the customization. Why do you need to know this? Well, you need to understand what is involved in upgrading the customization for new versions. How about adjusting or adding functionality? Will it mean writing code, or something simpler? How about source code? Does the customization have source code, and who owns it (in most cases the developer/company, not the client, retains ownership)? What does that mean if the developer stops working for the company? Or if you change companies? Will there be a detailed technical design document that can be used if other developers need to be engaged? And is the technology specialized (e.g., Dexterity) or more common (e.g., VB, .NET, etc.)? All are important questions that impact the longevity and flexibility of the customization.

Conduct full testing with a method for collecting feedback internally, so that you can ensure that the customization indeed meets expectations and enhances the user experience. It is not uncommon for a customization to be developed per the specification but still need adjustments in practice to make it truly useful for the users. When this happens, clients will sometimes "stall out" out of fear of additional costs, even though, in the long run, the additional costs incurred at this stage could save frustration as well as future replacement costs if the customization is abandoned. Just make sure that, at this point in the project, the spec and process documentation are updated with any changes.

What else would you add to the due diligence for clients and customizations? Let me know!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Tuesday, December 27, 2016

I previously thought of source code control as just another piece of software--an application that you use that manages the versions of your code. SourceSafe, SVN, Git, Team Foundation Server, there are many options for software or services that will take care of source code control / version control for you. As long as you "use" one of those solutions, you're all set.

But today I learned a pretty significant lesson. It is probably obvious to many people, but it was a bit of a wake-up call for me.

"Source code control" is not Git or Team Foundation Server or VisualSVN. Those are just tools, one piece of a larger process. And regardless of which tool you use or how great that tool is, if you don't have a solid, resilient process surrounding that tool, you will likely experience breakdowns.

Last week I sent out a new version of a Dynamics GP customization. The new version only had a few minor changes to enhance error handling--I added several try/catch blocks to try and track down an intermittent error that the user was seeing.

The customer tried the new version today, but they quickly noticed that a feature that was added over 3 months ago was gone. Uh oh.

While making the latest error handling changes, I noticed some oddities. The last release was version 1.30, but the projects in Visual Studio were still set as version 1.21. I checked the version 1.30 files that were released, and they were versioned properly, so something happened that caused the Visual Studio projects to revert to version 1.21. I wouldn't have done that manually.

I then checked my Documentation.cs file that I maintain on all of my projects. The last note I had was for version 1.21. No notes on version 1.30. That's not like me, as that's usually my first step when updating a project.

I then checked the Git branches for the project. Visual Studio was using branch 1.31, but it was only a local branch and hadn't been published to BitBucket. A 1.30 branch was published, but there were no notes on version 1.30 in my local repository or on BitBucket.

I checked the Commit log on BitBucket, and that left me even more puzzled. I didn't seem to have any commits acknowledging the version 1.30 release.

I see check-ins for v1.2 and v1.21, and the new v1.31 branch, but nothing for v1.30.

Somehow I had produced a version 1.30, with correct version numbers in Visual Studio, which produced properly versioned DLLs, which got released to the customer, but I have the following problems:

1. I either didn't update my Documentation.cs file, or it somehow got reverted to a prior release, causing my changes to be wiped

2. Despite having v1.30 and v1.31 branches in Git, I didn't see any changes when comparing them to each other, or to v1.21.

3. I can't find any evidence of a version 1.30 release in BitBucket

The only evidence I have of a version 1.30 release is the separate release folder I maintain on my workstation, where I did document it in the release notes.

And I see that the DLLs were definitely version 1.30, so I'm not completely imagining things.

So somehow, I managed to make the following mistakes:

1. Potentially reverted my code to a prior release and lost some changes

2. Didn't clearly perform a v1.30 check-in, or if I did, my commit comments did not indicate the version number like I usually (almost always) do

3. Created a v1.31 branch for an unknown reason that I didn't publish and didn't document.

4. Somehow made what is likely a series of several small mistakes that resulted in the versioning problem that I'm trying to fix today.

The most frustrating part is that it isn't obvious to me how such a rollback could have happened.

And all of this despite the fact that I'm using an excellent IDE (Visual Studio), an amazing version control system (Git), and a fantastic online source code management service (BitBucket).

My problems today have nothing to do with the tools I'm using. They clearly stem from one or more breakdowns in my process. And this was just me working on a small project. Imagine the complexities, the mistakes, and the issues that come up when 15 people are working on a complex development project.

So today I learned that I have a process issue. Maybe I was tired, maybe I was distracted, but clearly I forgot to complete multiple steps in my process, or somehow managed to revert my code and wipe out the work that I did.

I now see that I need to invest in my process. I need to automate, reduce the number of steps, reduce the number of manual mistakes I can make, and make it easier for me to use the great tools that I have.
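As a first step toward that automation, a small pre-release sanity check could catch exactly the mismatches above before anything ships. The sketch below is just an illustration of the idea, assuming the conventions mentioned in this post (an AssemblyVersion attribute in the Visual Studio project and a Documentation.cs release log); the `pre_release_check` helper name is made up.

```python
import re

def pre_release_check(version, assembly_info, documentation):
    """Hypothetical pre-release sanity check.

    Flags the two mismatches described above: a Visual Studio project still
    stamped with an old AssemblyVersion, and a Documentation.cs file with no
    note for the release being shipped. Returns a list of problems; an empty
    list means the release looks consistent.
    """
    problems = []
    match = re.search(r'AssemblyVersion\("([\d.]+)"\)', assembly_info)
    if match is None:
        problems.append("No AssemblyVersion attribute found")
    elif not match.group(1).startswith(version):
        problems.append(f"AssemblyVersion is {match.group(1)}, expected {version}")
    if version not in documentation:
        problems.append(f"Documentation.cs has no note for version {version}")
    return problems

# A project reverted to 1.21 while shipping 1.30 fails both checks:
assembly_info = '[assembly: AssemblyVersion("1.21.0.0")]'
documentation = "// v1.21 - improved lookup window"
print(pre_release_check("1.30", assembly_info, documentation))
```

Even a tiny check like this, run automatically before tagging or zipping a release, would have caught both the reverted version number and the missing release notes.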

I don't know how to do that yet, but I'm pretty sure that much smarter people than I have had this same issue and come up with some good solutions.

Wednesday, December 21, 2016

Sometimes in this line of work, you get so used to doing things one way that it takes a bit of a jolt to remind you that there are other ways to approach things. Sometimes that jolt comes from a coworker's comment, or a client asking a question. I like those moments, because they encourage innovative, creative thinking. And innovative, creative thinking challenges me and, honestly, makes this job a whole lot more fun!

One example of this sort of situation is with acquisitions. Specifically, situations where a company on GP is acquired but has plans to stay on GP through the transition. Typically, this means the following...

Identify closing date of acquisition

Set up a new company database

Transfer over acquired assets and balances as of the transition date

This approach works well when you are dealing with a single company, or maybe a couple. It works because it is...

Straightforward

Relatively simple

Clean (the client can keep the history of the former company in the old company while the new company starts fresh)

Where this process doesn't work so well is when we start talking about...

Lots o' companies

Lots o' data/modules/customizations/integrations

In these cases, the idea of setting up multiple brand-new companies, copying data, and ensuring that customizations/integrations work can be a bit daunting in the midst of an acquisition. This is doubly true if the customizations/integrations support critical day-to-day business operations. Those of you who know me know that I don't believe we shouldn't do something just because it is "daunting." But these "daunting" things do mean we have to approach the project with a higher level of due diligence in advance, as well as project management during the project, to mitigate risks.

So what about another option? Can we avoid setting up all these new companies? Yes, we can. It just requires a bit more creative thinking. As an alternative, we can approach it like this...

Continue forward with same companies

Backup companies at transition date to create historical companies

Remove history as needed from live companies

Enter manual closing entries as of the transition date (assuming fiscal year is not changing, and transition is not fiscal year end)

Reset assets and any other balances as needed (this can be the tricky step, involving scripts to set, for example, original cost basis equal to net cost basis in order to move forward)

Now, the process above also requires due diligence in advance, to make sure all transition needs are identified and planned for. But it can save effort and reduce risk in some cases. So...a solution to consider. What other creative/innovative approaches have you seen to handling acquisitions in Dynamics GP? I'd love to hear from you!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Tuesday, December 13, 2016

During the course of a new implementation of Dynamics GP, we usually have a discussion surrounding the chart of accounts. Do you want to change it? If so, how? How well does it work for you today? And clients sometimes vary in their willingness to explore changing it. Some are open to discussion, to see how they might tweak it to better support their needs, while others are satisfied with what they use today. From time to time, we also find ourselves discussing the chart of accounts structure with clients who have been on Dynamics GP for a number of years or even decades. In those cases, the company may have grown and the reporting needs have also changed.

I thought it might be worthwhile to share some of my own discussion points when exploring the chart of accounts structure with both new and longtime Dynamics GP users. So where do I start? I always start with the desired end result...Reporting! So let's start there, and then toss in all my other typical considerations...

What are the current and desired reporting needs? How are reports divided/segmented (departmental, divisional, etc)? Are the lowest levels for reporting represented in the chart of accounts today? How about summary levels? Do the summary levels change in terms of organization over time (so maybe they shouldn't be in the chart of accounts structure)? Is there reporting and/or other tracking in Excel that should be accommodated by the chart of accounts structure so that the reporting can be automated?

What about budgeting? What level does budgeting occur at? Is that represented?

What about other analytics? Are the components available in the chart of accounts? Are there statistical variables? Are they in Dynamics GP as unit accounts?

How does payroll flow to the general ledger, does it align to the chart of accounts (e.g., departments, positions, codes, do they match up)? Is there an expectation of payroll reporting from the general ledger in terms of benefit costs, employee costs, etc? Are those levels represented in the chart of accounts?

Are your segments consistent? Does a value in department mean the same thing across all accounts? Or do you need to look at multiple segments to determine the meaning (e.g., department 10 with location 20 means something different than department 10 with location 40)? Consistency is a goal whenever possible to facilitate reporting.

How about your main accounts? Review a distinct list: are they logical, in order, and do they follow the norm (e.g., expenses in the 6000s)? Is there room to add main accounts? Are there duplicated/inconsistent main accounts?

Do you do allocations? If so, how and by what factors? Can we use fixed or variable allocations to facilitate in GP? Do we have the needed components in the chart of accounts to determine what to allocate from and to? Do you want to offset the allocation in separate accounts to see the in/out of the allocation?

Anything I missed? Thoughts, comments? Please share and I will update the list!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Monday, December 12, 2016

Here are two obscure eConnect errors that you should never encounter. Unlike my customer who did encounter them.

Error Number = 311 Stored Procedure = taPMTransactionInsert Error Description = Misc Tax Schedule ID (MSCSCHID) does not exist in the Sales/Purchases Tax Schedule Master Table – TX00102. MSCSCHID = (Note: This parameter was not passed in, no value for the parameter will be returned.)

Error Number = 312 Stored Procedure = taPMTransactionInsert Error Description = Freight Tax Schedule ID (FRTSCHID) does not exist in the Sales/Purchases Tax Schedule Master Table – TX00102. FRTSCHID = (Note: This parameter was not passed in, no value for the parameter will be returned.)

Notice that the error says that the tax schedule ID does not exist, but then says that no value was passed in for the tax schedule ID.

So, if you are sending in a blank tax schedule ID value to eConnect, how can it be invalid, and thus cause this error?

As with many eConnect errors like this, the error is not caused by what you send to eConnect. It's caused by some value or configuration option buried deep in Dynamics GP that is impossible to identify from the eConnect error alone.

So what is happening here? If no tax schedules are passed into taPMTransactionInsert, eConnect tries to get default Tax Schedule IDs from the Payables Setup table, PM40100. Once it gets those Tax Schedule IDs, it validates them.

So...how could that cause the error we're seeing?

Figured it out yet?

The only way the default Tax Schedule IDs in PM40100 could cause the error would be if those default Tax Schedule IDs are INVALID!

Wait a minute. How could the default tax schedule IDs in the Payables Setup Options window be invalid, you ask? The Payables Setup Options window validates those at the field level--the window won't let you enter an invalid value or save an invalid value.

So, that leaves either a direct SQL update to set an invalid value in PM40100, or perhaps more likely, someone ran a SQL delete to remove records from TX00102. My guess is that someone figured they didn't need a bunch of pesky tax schedules, or wanted to change some tax schedule IDs, and they didn't realize that the PM40100 was also storing the tax schedule IDs.

I've asked the consultant to run this query to check the tax schedule IDs set up in PM40100.
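The check boils down to comparing PM40100's stored defaults against TX00102. Here is a rough sketch of the idea, not the exact query: the PM40100 column names (TAXSCHID, FRTSCHID, MSCSCHID) are assumed from the eConnect parameter names, and the `invalid_defaults` helper is purely illustrative, so verify the schema before running anything against a live company database.

```python
# Hypothetical reconstruction of the PM40100 default tax schedule check.
# PM40100 and TX00102 come from the error text; the PM40100 column names
# are assumed from the eConnect parameter names (MSCSCHID, FRTSCHID).
CHECK_SQL = """
SELECT  p.TAXSCHID, p.FRTSCHID, p.MSCSCHID,
        CASE WHEN EXISTS (SELECT 1 FROM TX00102 t WHERE t.TAXSCHID = p.TAXSCHID)
             THEN 1 ELSE 0 END AS TAXSCHID_IDExists,
        CASE WHEN EXISTS (SELECT 1 FROM TX00102 t WHERE t.TAXSCHID = p.FRTSCHID)
             THEN 1 ELSE 0 END AS FRTSCHID_IDExists,
        CASE WHEN EXISTS (SELECT 1 FROM TX00102 t WHERE t.TAXSCHID = p.MSCSCHID)
             THEN 1 ELSE 0 END AS MSCSCHID_IDExists
FROM    PM40100 p
"""

def invalid_defaults(defaults, master_ids):
    """Return the non-blank default schedule IDs missing from the master list."""
    return [s for s in defaults if s.strip() and s not in master_ids]

# Example: rows deleted from TX00102 orphan one of the stored defaults.
# A blank default is fine; a non-blank default with no master record is not.
print(invalid_defaults(["ALLTAX", "FRTTAX", ""], {"ALLTAX"}))  # → ['FRTTAX']
```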

If the tax schedules have values, but the "IDExists" fields have a value of 0, then that means there are no matching records in TX00102, and that the values are invalid.

And that is the solution to your mystery eConnect error of the week!

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles. He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, December 8, 2016

Sometimes when it rains, it pours. It seems like requests come in waves, and in the last two weeks I have had 4 separate clients ask about or implement emailing remittances. It seems like such an obvious thing, because if you are avoiding printing checks, why wouldn't you want to avoid printing remittances as well?

The good news is that it is super simple to set up. Assuming you want the emails to be routed directly through Exchange (and not through a local email client), you first need an account to be used for sending the emails. Second, your Exchange server needs to have the Autodiscover option enabled. Then it is really as simple as the following four steps...

1. Admin-Setup-System-System Preferences, select Exchange for the email option
2. Admin-Setup-Company-Email Message Setup, create a message ID and message for the emails
3. Admin-Setup-Company-Email Settings, set options for emails (including document type) and then click Purchasing Series to enable and specify the email message ID for remittances
4. Cards-Purchasing-Vendor, enter email addresses using the Internet Addresses (blue/green globe) for the remit to address (use the To, CC, and BCC fields as appropriate) and then enable the email remittance under the Email Settings button for the vendor

Once you have these steps completed, it is as simple as choosing to email remittance forms when you are in the Process Remittance window (Transactions-Purchasing-Process Remittance). Keep in mind, I definitely recommend doing this first with a single vendor using your own email address, as you may want to tweak the format and/or the email message.

Happy emailing!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.