Category: Dynamics GP


One of Microsoft’s buried SQL gems.  I was working with a client whose inter-server Dynamics GP performance was so poor the system was virtually unusable, yet GP ran fine at the SQL Server itself.  We discovered that the SQL ODBC connection was communicating over Named Pipes only.

Without getting into the troubleshooting detail, their IT group had changed the default SQL port on Microsoft SQL Server 2016, and Microsoft, in all their infinite wisdom, removed the port selection setting from the client ODBC DSN configuration interface a number of versions back.  So how do you tell ODBC which port to connect to?

The answer is simple.  During configuration, when you select the server, enter the SQL Server name and instance, followed by a comma (,), followed by the port number – no spaces.

For example, if your SQL Server instance is CONTOSO1\GP but needs to use TCP/IP port 12345, enter CONTOSO1\GP,12345 in the Server line. If SQL is on the default instance, it would be CONTOSO1,12345.  It’s that simple.
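If you want to confirm which protocol and port a workstation’s connection is actually using, a quick check from that session works well.  This is a minimal sketch using the standard sys.dm_exec_connections view; run it over the connection you care about (for example, through the GP ODBC DSN in a query tool):

    -- Shows the transport (TCP vs. Named Pipes) and the TCP port for the current connection.
    SELECT net_transport,      -- 'TCP' or 'Named pipe'
           local_tcp_port,     -- server-side port when the transport is TCP
           auth_scheme
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;

If net_transport comes back as ‘Named pipe’ when you expect TCP, the DSN (or the client protocol order) is the first place to look.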

This is another post on the perils of network performance, hardware, and file placement as they relate to Dynamics GP’s client/server processing model.

While I could use posting, reporting, or data entry, I will use SmartList as the example, since it provides a generic platform for a non-technical analysis.

Let’s consider a simple action in Dynamics GP – the version is irrelevant for this scenario, other than the fact that SmartList was introduced in early versions of Great Plains Dynamics and CS+ and is commonly used.

  1. A normal system – our baseline:
    a. You request a Payables Transaction SmartList.
    b. The request is sent to the SQL Server Database Engine.
    c. Relevant data returns to the user’s temp folder in small packets (25 records each) for processing and temporary storage pending query completion.
    d. Results display to screen for further user interaction.
  2. A normal client with active antivirus scanning enabled
    a. You request a Payables Transaction SmartList
    b. The request is sent to the SQL Server Database Engine
    c. Relevant data returns
    d. Packet 1 – the antivirus engine scans the packet to ensure it is safe, then it is written to temp storage
    e. Packet 2 – the antivirus engine scans the packet
    f. Packets 3–40 – the same scan repeats for each packet (assuming the default ‘first 1000 records’ at 25 records per packet)
    g. Results display to screen
  3. A client with a low-bandwidth NIC (100Mb, Wireless)
    a. You request a Payables Transaction SmartList
    b. The request is sent to the SQL Server Database Engine
    c. Relevant data returns, but cannot be processed as quickly because packets back up waiting their turn to get through. Think of the data as water, with the server’s 1 Gb+ NIC as a garden hose. With a 100 Mb NIC, we are trying to force all the water from the garden hose into a drinking straw. Once the water is shut off (SQL processing finished), the hose will still empty, but only as fast as the straw’s diameter allows.
    d. Packets arrive and results display on screen slowly

  4. A client (or server) with slow disk drives or drives busy with other activities
    a. You request a Payables Transaction SmartList
    b. The request is sent to the SQL Server Database Engine
    c. The query runs on the server and writes to temporary tables, but both reading and writing are limited by how fast the server disks can read, write, and process the request. This is most noticeable with 5,400 RPM drives, but still noticeable with 7,200 RPM HDDs. If multiple databases are written to simultaneously, performance drops even more. That’s why data files, log files, and tempdb should each be on their own physical drives or drive arrays.
    d. Relevant data returns, but is again hindered by a drive that cannot write fast enough to keep up with the returned data, or because the disk is busy writing for other processes, calculating spreadsheets, rendering drawings, etc.

  5. A client with roaming profiles
    a. You request a Payables Transaction SmartList
    b. The request is sent to the SQL Server Database Engine
    c. Relevant data returns to the temp folder, which now lives on a different server
    d. The client queries and processes the data in that temp folder across the network, so we are now shuffling many bundles of data to and from another server
    e. Data returns to the client and displays on screen for further processing

As you can see from the above examples, there are numerous ways in which data can be slowed down, backed up, or bottlenecked. Keep in mind that any combination of the above scenarios is also possible.

It’s also easy to alleviate many of these trouble spots.
In scenario #2, create antivirus exceptions for the known temporary files:
  * In the %temp% folder:
    * TNT*.dat
    * TNT*.idx
    * TNT*.tmp
  * In a %temp%\TNT* subfolder:
    * ASI*.dat
    * ASI*.idx
In scenario #3, make sure the NIC is a minimum of 1 Gb, and that any hubs (old) or switches support 1 Gb throughput. Most new computers already comply. If you’re connecting to your network over a wireless connection, plug in a network cable. If your notebook doesn’t have a NIC port, there are USB adapters.

In scenario #4, the faster the drive, the better the performance. A solid-state drive (SSD) is fastest, and the increased productivity will quickly offset its additional cost.
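Still on scenario #4, it is worth confirming on the server side that data files, log files, and tempdb are not all sharing the same physical drive.  A minimal sketch using the standard system catalog:

    -- Lists every database file and where it physically lives.
    -- If the GP data files, log files, and tempdb all share one drive letter,
    -- they are competing for the same disks.
    SELECT DB_NAME(database_id) AS database_name,
           type_desc,            -- ROWS (data) or LOG
           name AS logical_name,
           physical_name         -- full path, including drive letter
    FROM sys.master_files
    ORDER BY physical_name;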

In scenario #5, if your IT group requires roaming profiles, make sure your network is as fast as possible, but ask whether temp files can be stored locally rather than on another server.

Dynamics GP Process Server, available since the early versions of Great Plains Dynamics, has been pretty much ignored by most of us throughout the years.  Yet it can do so much to assist in many situations.

Since most GP users (and honestly, many GP consultants) have never heard of the Dynamics Process Server, let’s begin by explaining its purpose.  As explained in a previous post, Dynamics GP is a client-server application, meaning that processing happens both on the client and in the SQL Server database engine.  We insist on fast servers and great server performance, yet overlook the performance of the equally important workstation component.  Many of the posting and printing functions are done at the workstation.  If the workstation is underpowered, has too many other applications open in the background, or the user needs to push on to the next task but is blocked by other GP tasks already running on the client, these workstation-intensive tasks can often be sent to a Dynamics Process Server (which we will refer to as DPS going forward).

A DPS ‘server’ is nothing more than a GP client installed on a workstation or server and used on either a dedicated or an as-needed basis – possibly a part-time employee’s workstation, or that of a user who is out for the day.  Specific tasks are assigned to these servers through the setup window in GP.  The good news?  There’s nothing to install, since the files are already on every GP client workstation by default.

Where DPS can really make a difference is in boosting both actual and perceived performance.  Faster machines can be used to process long-running, processor- or RAM-intensive tasks, not only making the user’s workstation immediately available for additional work but also distributing the workload.  As an example, consider a GP user who needs to run an Accounts Receivable aging, process statements, and enter sales orders.  Without DPS, the user would run the aging, wait until it completed, process statements, wait until they completed, then enter the sales orders.  Even if these tasks could be run concurrently on the same workstation, RAM, CPU, and disk activity would slow further activity to a crawl, or possibly even crash the application.  With DPS in place, the user sets up the processes as usual, but rather than clicking the process button, they have the option to offload the processing to DPS.  At that point, they can proceed to the next task with no processing overhead on the workstation.

I will expand on this topic in upcoming posts.


The official System Requirements for Microsoft Dynamics GP 2018 document from Microsoft does not list network requirements, so some companies may forget to ask, or be tempted to think it doesn’t matter.

In fact, you must be aware that:

Dynamics GP is supported only when client and SQL databases are on the same Local Area Network (LAN).

Wide Area Networks (WAN), Virtual Private Networks (VPN), or any other variant are not supported and will cause, at minimum, poor performance and, at worst, corrupted data.

A minimum 1 Gbps LAN is required for normal GP performance.  This also means that notebook users should be discouraged from using Wi-Fi to run Dynamics GP.  Wired CAT5e cabling is the minimum acceptable connection.

If you have a mix of speeds, with both LAN and WAN/VPN users, it gets even scarier.  Dynamics GP tags each transaction with a Dex Row ID, and that next number is reserved until the transaction is complete.  If you have a local (fast connection) client and a remote (slow connection) client processing orders simultaneously, there is a possibility that these IDs could be assigned to the wrong transaction, depending on whose transaction completes faster, which would corrupt the transaction.  That is a very simplified layman’s description, but it should serve to show the importance.
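To make that a little more concrete, GP tables carry a DEX_ROW_ID identity column that SQL Server hands out as rows are inserted.  A minimal sketch, assuming the standard GP Sales Transaction Work table (SOP10100), to see the most recently assigned IDs:

    -- DEX_ROW_ID is an IDENTITY column on GP tables; the most recently
    -- inserted work records carry the highest values.
    SELECT TOP (5) DEX_ROW_ID, SOPTYPE, SOPNUMBE
    FROM dbo.SOP10100          -- Sales Transaction Work
    ORDER BY DEX_ROW_ID DESC;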

As faster internet speeds become more cost-effective, it gets more tempting to put distance between clients and servers, or to put the SQL server in a data center.  But even if your network card says 1 Gbps, it is not going to achieve that speed over a lossy WAN or VPN, where the likely maximum is 50–200 Mbps.

  • If you are implementing Microsoft Dynamics GP for the first time, make sure you review the system AND network requirements carefully.
  • If you are already running Microsoft Dynamics GP and you are thinking of making a change to your network, make sure you contact your Dynamics GP Partner to review your plans BEFORE making any changes.

Additional resources:

A short article on the problem of running database applications over a WAN: Terminal Services is one solution when users and the database(s) reside in different locations.  The Dynamics GP Web Client is another solution, but it still requires a second co-located server for the underlying GP client, IIS, and the connection management database.

This article by Dave Musgrave is the best explanation on the subject of WAN ODBC communications.

“You do not have permission to open this file.”  This error is always, always, always a reference to insufficient permissions on a shared folder or file, usually a forms or reports dictionary.  This behavior can occur when users are forcibly disconnected from a network resource, or when unsynchronized Active Directory catalog server permissions cause a conflict in access rights.  We usually see it when a new GP user has not been granted sufficient rights (Read/Write/Modify) in the shared file area (usually GPShare).  This issue may also present itself as an “Unknown Dictionary Error” if it appears prior to SQL login.
The GP launch chain of events is as follows:
  1. GP opens with Dynamics.exe calling business logic dictionaries listed in the Dynamics.set file.  Some of these paths may be on a different server.  No SQL connection is attempted, requested, or required at this point.
    a. If all dictionaries are accessible and open for read/write operations, the application launch proceeds to validate the SQL user.  If dictionaries do not have proper permissions or are inaccessible, the login fails with the “You do not have permission to open this file” error.  Note that depending on domain file sharing policies, if a terminal server still shows the user running a copy of Dynamics.exe, a domain group policy may deny the user access to a second copy of the file, which would yield the same error since access would be specifically denied.
  2. Once Dynamics.exe loads all the logic specified by the SET file, the SQL login screen opens for the user.
  3. Data is provided by the SQL Server databases, and windows are loaded from the logic dictionaries mentioned in #1 above.
  4. To finish the operational picture, GP transactions are client/server: GP requests as much data as necessary to satisfy the query, and that data is then processed on either the client or the server, depending on the transaction.  This is done through many ‘top 25’ queries until the desired data has been received, which guards against large blocks of data transmitting at one time and possibly overloading the network.  This constant request for data, and the validations against it, are the reason that server and client must reside on the same LAN.
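For illustration only – the Dexterity runtime generates its own SQL, so this is a hedged sketch rather than the literal statements GP issues – the ‘top 25’ pattern behaves roughly like a keyset walk:

    -- Illustrative sketch: fetch the next 25 rows past the last row the
    -- client has already received, then repeat until the query is satisfied.
    DECLARE @lastId INT = 0;   -- hypothetical bookmark held by the client

    SELECT TOP (25) DEX_ROW_ID, SOPTYPE, SOPNUMBE
    FROM dbo.SOP10100          -- Sales Transaction Work, used here as an example
    WHERE DEX_ROW_ID > @lastId
    ORDER BY DEX_ROW_ID;

Each round trip is small, which is gentle on the network, but it also explains why latency between client and server hurts so much: every one of those requests pays the full round-trip cost.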

GP & SQL Compatibility

I replied to a GPUG question earlier today from someone wanting to use SQL 2016 with GP 2015.  It bears repeating here and applies to similar situations as well.

When Microsoft says not compatible, they mean it.  It may work, but it’s not supported.  If you do run into a compatibility issue down the road, you have just burned all bridges, since you won’t likely be able to restore the databases to a supported SQL version later.  At that point you can only hope you’ve kept a copy of the databases from the earlier SQL version and re-enter everything from that point forward, or start over.  The problem is usually a deprecated function referenced in GP’s code, and the only fix is to rewrite the software.
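Before planning any move, it’s worth confirming exactly what you are running so you can check it against Microsoft’s compatibility list.  A minimal sketch using standard server properties and catalog views:

    -- Which SQL Server build are we on, and what compatibility level are the
    -- GP databases using?
    SELECT SERVERPROPERTY('ProductVersion') AS sql_version,
           SERVERPROPERTY('Edition')        AS sql_edition;

    SELECT name, compatibility_level
    FROM sys.databases
    WHERE database_id > 4;     -- user databases only (DYNAMICS and company databases)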

For years I was the “Damn the torpedoes, full speed ahead” guy, the one who could always trick the system into working after a client broke it.  But after seeing the destruction and mayhem it can cause down the road (read: potentially hundreds of consulting hours), I’ve learned it’s wise to trust Microsoft’s compatibility list.  It’s there for a reason.

What appears to be a clear question on the surface quickly clouds in the face of reality.  Your data is not only important, it’s vital to the life of your business.  There are several reasons that I will cover below.

Database upgrades complete with no errors.

This is the first reason clients think of when testing upgrades, and this is the one we are concerned least about.  If this were a data migration from one system to another, then yes, the figures must be checked closely.  Barring damaged data, database upgrades simply add columns and procedures to accommodate new processes and features in Dynamics GP.  Your data actually remains relatively untouched during this process.  While you may think initially that all you need to be concerned about is your ‘bottom line’, that’s actually the one thing you don’t need to worry about.  Check, yes.  Concern, no.  What is more likely to have changed is the way to access that data and how it gets reported.  Bottom line on your bottom line, I cannot recall one upgrade in thousands where the figures did not tie out.
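If you do want to tie out the bottom line yourself, a quick before-and-after comparison of posted GL activity is enough.  A hedged sketch, assuming the standard GP open-year table GL20000 and its DEBITAMT/CRDTAMNT columns; run it in the company database before the test upgrade and again after, and the totals and row counts should match exactly:

    -- Compare these figures before and after the test upgrade.
    SELECT COUNT(*)      AS open_year_rows,
           SUM(DEBITAMT) AS total_debits,
           SUM(CRDTAMNT) AS total_credits
    FROM dbo.GL20000;    -- GL Year-to-Date Transaction Open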

Vital business processes are not hampered.

This brings us to the second and most crucial step, yet the one passed over by most clients – testing processes.  Data doesn’t do us much good if we can’t act or report on it.  Each new version of Dynamics GP brings new features, and those ‘features’ occasionally move an action button or drop-down from the bottom of a window to the top, or even to another window.  You need to know where changes have been made so you aren’t trying to learn them while the UPS driver is waiting for a check.

Speaking of checks, their formatting, along with that of many other reports (yes, a check is a ‘report’ to the software), is also modified during the upgrade to accommodate the table changes noted above.  Some of the most frequently customer-modified reports include GL Trial Balances, SOP Invoices, check layouts, and Purchase Orders.  When the formats are upgraded, sort orders or other details sometimes change, which can affect the look of your invoices, checks, etc.

It is imperative that you devote time during a test upgrade to run through:

  • Daily Processes in Finance, Payables, Receivables, Payroll, Inventory
  • Weekly Procedures
  • Monthly Procedures
  • Quarterly and Year-end Procedures

That includes printing checks – to blank paper – then holding the printed page up against blank computer check stock to verify alignment.  Don’t forget to shred the printed check forms, particularly if GP is printing the signature.

Third-party and integration tasks should be checked as well.  Stand-alone programs and web interfaces should be checked for proper operation, functionality, and security.

Little or no testing can jeopardize your upgrade and result not only in lost productivity for you, but in additional costs to handle an emergency that would have been uncovered and corrected in testing.  Will we do an upgrade without a test?  Certainly, if you can afford the downtime and loss of productivity.  The no-test scenario is usually reserved for 1–2 user systems running Payables and GL only.

Frequently an upgrade also means a new server.  This gives you a perfect chance to test the capacity and capability of the new server: load it as you would on a normal business day and make sure the application is as responsive as you need it to be, before you’re in production and it’s too late to fix a problem without losing production time.  So don’t just have one person test the system – plan a time when everyone can test together.  If there are problems, we can diagnose and correct them before they become critical.

CAL’s ‘What’s New’ training can be beneficial if done during the test upgrade phase.  This will eliminate many of the navigation questions that may arise when first encountering the upgraded interface.  The training can be reinforced with additional assistance when the live upgrade is performed.

There are several reasons for backing up your Dynamics GP SQL Data and shared files, among them:

  • Business Continuity
  • Human or Computer Error Recovery
  • Historical Preservation
  • Data Health and Maintenance


Business Continuity or Disaster Recovery is the most comprehensive and usually requires the most planning.  You must consider how much data loss is acceptable and how long your accounting system can afford to be down.

Let’s first consider the types of SQL recovery models available.  It is important to understand these because the types and granularity of the restores you can perform depend on them.  For our purposes we will focus on the two most common recovery models, Simple and Full.

Simple recovery model is just that – simple: the backup job creates the backup file, and SQL Server itself clears committed transactions from the log and truncates it as it goes.  The database can only be restored to the time a backup was created – there is no point-in-time recovery.  Backups can be run as often as needed, but remember, you’re backing up the entire database each time, so keep a close eye on drive space.

Full recovery model (the one we recommend) has many more options, but comes with a bit more overhead.  In full recovery mode, the full backup includes both the data and the log file.  It does not, however, truncate the log file.  There is a secondary, and more frequent, backup called the Transaction Log backup (never could have guessed that one, right?) that does the actual truncation of the log.  In a restore situation, you restore the full backup plus any log backups to get you to the point of the error.  This type of backup is your best bet in the event that someone accidentally clears data or some other human error occurs.
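As a minimal sketch of what that looks like in T-SQL – the database name and paths here are placeholders, and most shops let a maintenance plan or backup tool generate the equivalent commands:

    -- One-time setting: put the company database in full recovery.
    ALTER DATABASE [TWO] SET RECOVERY FULL;   -- 'TWO' is a placeholder company database

    -- Nightly (or more frequent) full backup.
    BACKUP DATABASE [TWO]
    TO DISK = N'D:\SQLBackups\TWO_Full.bak'
    WITH INIT, CHECKSUM;

    -- Frequent transaction log backups; these are what actually truncate the log.
    BACKUP LOG [TWO]
    TO DISK = N'D:\SQLBackups\TWO_Log.trn'
    WITH CHECKSUM;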

For Business Continuity purposes, you ideally should have a balance of on-disk and external media on- and off-site.  Two or three days should be sufficient for on-disk backups, as you would rarely want to take your accounting system back that many days.  External media retention is up to you, but a week’s worth of backups with at least one day offsite works well for most.

Many companies want or require an archived year-end backup.  Back up the database to external media and archive as desired.  You may want to burn these to optical disc, as the shelf life is longer and the media would not be affected by EMP, media deterioration, or accidental erasure.  Include a copy of the current DYNAMICS database for ease of data access.  Your company may have moved on, but this data is locked in time.

Don’t let your IT tell you that they are snapshotting the server and you don’t need SQL backups.  With full recovery model, you must have both Full and Transaction Log backups to keep the database healthy and keep size and performance in check.

One additional word on Server Snapshots, particularly virtual server snapshots.  If you are running them for Disaster Recovery, DO NOT run them during the work day.  SQL interprets the snapshot scan of the live databases as a disk freeze and any transaction which may be taking place during the few seconds of snapshot scans will likely be corrupted and could require extensive repair of the database if not caught quickly.

One other common error is to set the database to full recovery model, then let other backup software do the database backups.  This is fine AS LONG AS that software is also capable of running transaction log backups.  If you only back up the database and not the log, the log never truncates; it will cause serious performance issues and eventually fill up the disk, at which point the database (and GP) shuts down.  Know your recovery model, backup type, and frequency.  We can work with your IT staff to develop a plan that works for you.

Test your backups!  Even the best of us can fall into that trap.  The backup looks good, but if you can’t restore it, you might as well not have one.  That’s the reason for my comment on on-disk backups above.  If you need to restore to a point in time and your transaction logs are backed up to disk but your full backup is on some tape somewhere, you’ll be down for as long as it takes to recover the backup media and get it mounted.  Keep several recent full backups and the relevant log backups on disk for speedy recovery.
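For reference, a point-in-time restore under the full recovery model looks roughly like the following hedged sketch (the database name, paths, and timestamp are placeholders):

    -- Restore the most recent full backup, but do not recover yet...
    RESTORE DATABASE [TWO]
    FROM DISK = N'D:\SQLBackups\TWO_Full.bak'
    WITH NORECOVERY, REPLACE;

    -- ...then roll the log forward, stopping just before the mistake happened.
    RESTORE LOG [TWO]
    FROM DISK = N'D:\SQLBackups\TWO_Log.trn'
    WITH STOPAT = '2018-06-01 14:55:00', RECOVERY;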

There are some non-SQL files that should be backed up regularly as well.  That includes modified reports and forms dictionaries, the FRx SysData folder (if you’re still using FRx, and if you are, why? …but that’s another topic), the Integration Manager database, signature files, the Mekorma stub library, etc.  CAL usually keeps these under a server ‘GPShare’ folder, so include that folder in your backups.

Several final notes and frequently misunderstood items:

  1. Backing up your DYNAMICS database does not back up your data. The DYNAMICS database is the GP system database and contains only system-level information – users, registration, security, etc.  The actual transactions are in your company database(s).
  2. When you add a company to Dynamics GP, make sure the company database is included in the backup. I usually set the backups for ‘all user databases’ – that way, if a new company is added, it’s automatically included in the backup.  If you use ‘all user databases’, you’re also covered for the Management Reporter and SQL Server Reporting Services databases (see the sketch after this list).
  3. Check your SQL System backups – Master, Model, and MSDB. With those you can recover your GP SQL users in the event of a rare, but possible, SQL application crash.  Without them, you will have to create new users in SQL to tie back to the GP users.
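As mentioned in item 2, here is a quick way to see which databases an ‘all user databases’ job will actually pick up – a minimal sketch against the standard catalog:

    -- Everything with database_id > 4 is a user database: DYNAMICS, each GP
    -- company database, and (if present) the Management Reporter and
    -- Reporting Services databases.
    SELECT name, state_desc, recovery_model_desc
    FROM sys.databases
    WHERE database_id > 4
    ORDER BY name;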

A client on GP 2010 had just installed the Professional Services Tools Library, with the primary intent of running the Copy Company function to transition from test to production.  The utility was failing with the error:

Unhandled script exception:
[Microsoft][SQL Server Native Client 10.0][SQL Server]Could not find stored procedure ‘XYZ.dbo.eeCompanyCopy’.

The cause was that the end user had set up their own ODBC DSN and had not cleared the ANSI check boxes (‘Use ANSI quoted identifiers’ and its companion ANSI nulls setting).  Clearing the ANSI flags on the DSN allowed the utility to run without error.
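If you want to verify what a given DSN is actually sending, a minimal sketch run over that connection shows the session’s effective settings:

    -- 1 = ON, 0 = OFF.  A GP ODBC connection should show both of these as 0,
    -- which is what clearing the ANSI check boxes in the DSN accomplishes.
    SELECT quoted_identifier, ansi_nulls
    FROM sys.dm_exec_sessions
    WHERE session_id = @@SPID;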


It should have been an easy upgrade…  Dynamics GP 9.0 to GP 11.0 (GP 2010), but I needed to get 9.0 to its latest service pack before running the GP 2010 upgrade.  The service pack would not install, so I turned off DEP, turned off UAC – it still failed.  I even pulled out my bag of tricks and copied the server-side folder to a workstation, replacing the existing GP install (after zipping the original folder).  That updated the application, but there was nothing in Utilities to allow me to update the databases.

Since Microsoft no longer supports GP 9.0, KB searches came up empty, but in researching I came across a blog entry that referred to updating the .NET Framework 1.1 to SP1 and also referenced a KB article that no longer exists.  I downloaded and applied .NET 1.1 SP1, accepted the warning that it had compatibility issues with Server 2008, and the framework install completed successfully.  I re-launched the GP 9 service pack MSP, and IT RAN!

Hopefully this helps others in this situation.