It should have been an easy upgrade… Dynamics GP 9.0 to GP 2010 (11.0), but I needed to get 9.0 to the latest Service Pack before running the GP 2010 upgrade.  The Service Pack would not install, so I turned off DEP, then UAC; it still failed.  I even pulled out my bag of tricks and copied the server-side folder to a workstation, replacing the existing GP install (after zipping the original folder).  That updated the application, but there was nothing in Utilities to allow me to update the databases.

Since Microsoft no longer supports GP 9.0, KB searches came up empty, but while researching I came across a blog entry that referred to updating the .NET Framework 1.1 to SP1 (and also referenced a non-existent KB article).  I downloaded and installed .NET 1.1 SP1, accepting the warning that it had compatibility issues with Server 2008, and the framework install completed successfully.  I re-launched the GP 9 MSP, and IT RAN!

Hopefully this helps others in this situation.

I have been working with Windows 8 since its (legal) RTM availability on MSDN, and unless Microsoft *SIGNIFICANTLY* changes the interface for mouse/keyboard-input business PC users before its public release, it will die faster than Vista. DO NOT encourage anyone to wait for this latest OS – it’s a killer – in all the wrong ways. …just sayin’

There is a great option that restores the ‘feel’ of previous Windows versions and adds some great new features – try Winstep Xtreme – you can download your copy here for free.  I’ve been using it for over 10 years now, more as ‘eye candy’ than for productivity, but now I get the best of both.

Updated 3/19/2018 to include new findings based on real-world experience, newer Operating Systems, and best practices.

I am frequently asked to condense Microsoft’s detailed System Specifications for Dynamics GP. Following is a ‘short list’ of server hardware requirements and recommendations that will give maximum performance at reasonable cost:
1. 64-bit hardware platform, Hardware RAID. You want Hardware RAID so the maintenance of the drive partitioning and striping doesn’t fall to the Operating System.  Windows Server 2012 (or later) Standard 64-bit and SQL Server 2012 (or later) 64-bit Standard Edition. 64-bit allows the use of memory over 4GB without disk swap. 1 or more quad-core processors.
2. 12GB RAM minimum.
3. Drive configuration – The recommendation is OS on RAID1 (2 drives), data on RAID5 (3 or more drives), a minimum of 5 drives total. One additional, often-overlooked factor is drive speed. Use 10,000 RPM drives as a minimum for the data drives, 15,000 RPM preferred. If it’s a matter of size versus speed, lower the size (15k RPM is not available in all drive sizes) to maintain speed. For the OS, 80GB of drive space minimum; for data, three (or more) 132GB drives in RAID5, yielding 260+GB of usable space. For maximum performance, add one additional small 15,000 RPM (or Solid State) drive and move the SQL tempdb to it. Isolating tempdb will boost SQL performance dramatically.
If you must back off the drive count, go for a RAID5 array of 10,000+ RPM drives in a single partition. In this configuration, partitioning data from the OS has no real effect on performance, since all partitions run on the same drives; it is the division of tasks by array and spindle that increases performance, not cosmetic partitioning.
If the server is virtual, either Hyper-V or VMware is supported. Allocate at least two processors and a minimum of 12GB RAM, and do not permit the host to dynamically allocate RAM. File and folder layout is not critical; however, you must use a fixed disk size rather than dynamic, and virtual disk files MUST be fully expanded at the host on creation. Calculate sufficient space for Microsoft SQL database growth, installation software, and on-disk data backups. Depending on the number of GP companies and anticipated data volume, an 80-120GB system drive and a 200GB data drive should be sufficient.
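As a sketch of the tempdb move recommended in item 3 (the drive letter and paths here are illustrative – substitute your own dedicated drive), the relocation is done with ALTER DATABASE and takes effect after a service restart:

```sql
-- Illustrative: move tempdb data and log to a dedicated fast drive (T: is assumed).
-- The logical names below are the SQL Server defaults; verify yours first with:
--   SELECT name, physical_name FROM sys.master_files WHERE database_id = DB_ID('tempdb');
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\SQLData\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\SQLData\templog.ldf');
-- Restart the SQL Server service for the new locations to take effect;
-- the old tempdb files can then be deleted manually.
```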

Do you find that each time you log into Management Reporter on Terminal Server or Citrix, you need to re-enter your login and default company?  Those settings are lost between sessions when user profiles are set up to delete temporary files and folders on exit – which, unfortunately, is where the Management Reporter developers chose to put the configuration files that carry that information from session to session.  Making the issue more difficult, Management Reporter requires a SQL user login per company, so it is not a single sign-on as it was in FRx.

Constantly adding your login information can be quite frustrating.  We and a number of others have asked Microsoft to change the location of the user configuration file to a different location.  The request is pending.

However, we can tell you how to change the settings so that deleting temporary files and folders on exit is turned off.  (Note: It actually needs to be that way for GP in the event a user is posting and the connection to the Terminal Server drops.  If the session is left running on the Terminal Server, the posting will continue normally.  If the setting to delete is enforced, the session is forcibly closed resulting in a hung posting or data corruption.)

The environment variables that hold the MR settings are %APPDATA% and %LOCALAPPDATA%.  Some GP functionality also depends on these variables.  User accounts need read and write access to the folders these variables point to, and those folders should be retained, not deleted.  On a Windows 7 workstation they point to C:\Users\User_Name\AppData\Roaming and C:\Users\User_Name\AppData\Local, respectively.  In a roaming profile, frequently used for Terminal Server/Citrix users, these locations may reside on a completely different server, requiring cross-server security changes.

Bottom line, it’s not an easy fix.  But it can be done, and it is one of those things that will make your users very happy.

Originally posted on http://www.erpsoftwareblog.com/2012/07/management-reporter-tip-how-to-maintain-logins-and-default-company-settings-between-sessions/

There is a wealth of information about this, but I have to believe that some people just don’t get it.

Let’s start with the easiest, the Simple recovery model.  You have only one choice for a restorable backup, and that’s a full backup.  Under Simple recovery, committed transactions are truncated from the log automatically, allowing the log file (.ldf) to maintain its smallest working size.  There is no transaction log backup for point-in-time restore, so you restore from the latest full backup.  SQL Express databases are almost always run in Simple recovery, since Express lacks the SQL Agent needed to schedule log backups.

In Full recovery, you back up the complete database (full backup), but you MUST also back up the transaction log.  If you do not run periodic transaction log backups, the log file (.ldf) will continue to grow until it either reaches a preset limit or (more often) fills the drive.  The transaction log backup is what clears and truncates the log.  To restore, you need the last full backup plus every transaction log backup up to the point of desired recovery.

Companies that use a software-specific backup agent for SQL backups are most commonly running full backups to tape, the cloud, or other media.  If your database is set for Full recovery, you must also back up the transaction log – either to the same destination using the agent, or via a SQL job that backs up the log separately.  In that case, you would restore from the full tape backup, then apply any subsequent transaction log backups to the point of desired recovery.
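As a minimal sketch of that Full-recovery routine (the database name, paths, and timestamp here are illustrative – TWO is the GP sample company), the backup chain and point-in-time restore look like this:

```sql
-- Illustrative Full-recovery backup routine.
-- Full backup, e.g. nightly:
BACKUP DATABASE [TWO] TO DISK = 'E:\Backups\TWO_Full.bak' WITH INIT;
-- Transaction log backups, e.g. hourly -- this is what clears and truncates the log:
BACKUP LOG [TWO] TO DISK = 'E:\Backups\TWO_Log.trn';

-- Restore: last full backup first, left in a restoring state...
RESTORE DATABASE [TWO] FROM DISK = 'E:\Backups\TWO_Full.bak' WITH NORECOVERY;
-- ...then each log backup in sequence, stopping at the desired point in time:
RESTORE LOG [TWO] FROM DISK = 'E:\Backups\TWO_Log.trn'
    WITH STOPAT = '2018-03-19 14:30:00', RECOVERY;
```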

I’ve seen too many cases where a company will choose the Full recovery model to minimize data re-entry in the event of drive failure or human error, but then run only full backups to external media.  The result?  A huge log file, or a drive that is out of space, stopping any additional data input until the condition is corrected.

Instructions for fixing full drives are available from Microsoft and many other sites, so I will not cover them here.

If you’re not certain what your database settings are for each of your databases, run this query in SQL 2005 or later:

select name, DATABASEPROPERTYEX(name, 'Recovery') as Model from sys.databases
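And if a database’s model needs to change to match your backup method, it is a one-line ALTER (the database name here is illustrative):

```sql
-- Illustrative: switch to Simple recovery if you only ever run full backups.
ALTER DATABASE [TWO] SET RECOVERY SIMPLE;
-- Or to Full, after which transaction log backups become mandatory:
-- ALTER DATABASE [TWO] SET RECOVERY FULL;
```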

Bottom line, use the right backup for your recovery model and method.

This relates immediately to a server install I was performing, but is something to keep in mind.
1. Don’t try to use SSRS (SQL Server Reporting Services) reports in GP if SQL is an Express edition of 2005, 2008, or 2012. According to the GP 2010 System Requirements, SQL Express is supported for everything but Analysis Cubes. I spent too much time trying to get it to work before I came across a blog post flatly stating that it does not. Even running GP as a local administrator and logged in with the SQL sa account, I continued to get the error that I did not have permission to install to that location. I set security in the Reports Site Settings (you need to run IE as administrator the first time to add yourself), gave our user, domain admins, and domain users full access to everything, then repeated the permission settings for folder security. I even went so far as to set security permissions on the file folder structure in Windows Explorer – same error. As I said, the blog post was enough to put a halt to my efforts.
2. The client wanted to use SQL 2012, which GP 2010 SP3 supports. However, Management Reporter would not validate on SQL 2012 to allow creation of the databases. In this case, the Management Reporter System Requirements DO state that SQL 2005 and 2008 are supported, with no mention of 2012.

One additional note – you will have little or no trouble installing the Management Reporter server components if you download and pre-install the Access 2010 Runtime and Access Runtime SP1 from Microsoft Downloads first. If you don’t, setup will hang on the MR server configuration windows. On a 64-bit OS, install the 64-bit Access Runtime and SP. If 32-bit Office is installed on the server, you cannot just run the 32-bit Access Runtime install, because setup senses a mismatch between the Windows Server and Office platforms. The only option is to uninstall 32-bit Office, install the 64-bit Access Runtime, and then reinstall Office (Word, Excel) as 64-bit.

This was an interesting project, one I have not had to deal with in a few years.  A client was still running single-user Great Plains 6.0 Ctree on Windows 98 (remember that?).  Fortunately, I still had the migration chunk file, so it was doable.  The first order of business was getting a copy of the code and data off the old PC; I ended up pulling the drive, attaching it to my notebook via a USB adapter, and copying the folder from there to a virtual Windows XP Mode install on my notebook.  The Ctree version fired right up, and I verified the data.  The upgrade to 7.5 was simple enough, except that I kept getting Pathname errors running Shrink on the PM and GL history files.  I verified that the files existed and that GP produced historical reports.  The problem was that during installation of GP 7.5, Utilities did not synchronize to the existing account framework.  As soon as I changed the dex.ini flag to TRUE and forced synchronization, all was well.  The data upgrade was flawless, as was Check Links.

On to migration.  I needed to use MSDE (SQL 2000 Express) for the database, which is also why I needed XP Mode.  I created the empty mirror database and ran the push migration from 7.5 Ctree.  The migration ran for 16 hours, and the data set was less than 500,000 records!  The processor was maxed out on the virtual machine, which accounts for the excess time, but I didn’t want to kill the process, change the processor and memory parameters for the VM, restore, and restart.  I need to remember that for next time.

The upgrade to 8.0 was textbook.  On to 10.0 – a new problem.  Component installation failed on SQL Server Native Client 9.0 and would not complete, and there was nothing in the error logs.  I then tried installing just the SQL client from the Bin folder, and finally received a usable error: installation was not permitted in a terminal session.  The fix was to disable integration components in XP Mode and restart the Virtual PC.  Success!  Remember to re-enable integration components so you can see your local Windows 7 drives.

The upgrade processes to 10 and finally 2010 were non-problematic, and running SQL backups between upgrades ensured a successful conclusion.