Exchange Mobile Device Issues

A user deleted and recreated their mobile account, which landed their mobile device in quarantine, but the device could not be approved. Also, quarantined devices belonging to users who have been deleted or disabled cannot be removed through the GUI.

The issue seems to be that the server keeps two sets of data to track mobile devices. The device could not be approved because it was already marked as approved, yet it remained stuck in quarantine. Rejecting the device then added it as an unapproved device for the user.

First I needed to delete some “zombie” devices. Step one was to set the PowerShell scope to the whole forest:
Set-AdServerSettings -ViewEntireForest $true

Then get a list of mobile devices:
get-mobiledevice |fl UserDisplayName,Guid,DeviceType

I then pick out the phone that I need to delete (by the Guid from the listing above) and remove it:
Remove-MobileDevice -Identity <DeviceGuid>

For the next issue, I need to get the user’s ID:
get-casmailbox |fl Identity

Then I need the device ID that I need to change (this will be the same Device ID shown in the Exchange management GUI):
get-casmailbox -Identity <UserID> |fl ActiveSyncBlockedDeviceIds,ActiveSyncAllowedDeviceIds

Then remove the device from the Blocked list or add it to the Allowed list:
Set-CASMailbox -Identity <UserID> -ActiveSyncBlockedDeviceIDs @{remove="<DeviceID>"}
OR
Set-CASMailbox -Identity <UserID> -ActiveSyncAllowedDeviceIDs @{add="<DeviceID>"}

CSI/Syteline “Customer Since”

One of the more lightly requested modifications to the Customers form in Syteline was a way to tell how long someone had been a customer (for credit check purposes). I rigged this up by first adding a field for CreateDate to the Customers form, which is easy since it’s already in the IDO for the form.

I then execute a SQL query to change the CreateDate to be the “customer since” date (going forward these should be the same, but for legacy customers the date would just be whenever they were added to the database):

UPDATE customer_mst SET CreateDate='1/1/20' WHERE cust_num='TEST' AND site_ref='TESTDB';

Notes on Scheduling and Costing in CSI/Syteline

We had some initial struggles getting up and running on our CSI implementation, so we started with one of our smaller operations. We had very little guidance on several nuances of the package (it’s a long story that’s not fit for print), so I’m going to touch on them here. The overarching rule is this: routers need to be correct so that both costing and scheduling are correct. If the routers are bad, neither costing nor scheduling will work as designed in CSI.

To begin, all manufactured parts in CSI need a router set up in Current Operations and Current Materials. In other systems these are called the “job templates”, though in CSI there is only one ‘true template’. I will point out that this mostly applies to the APS flow in an actual costed environment. I suppose it is possible to use a standard costed system and ignore the scheduler, in which case perhaps the ‘Currents’ are not needed. However, I think that is still bad form since the system will then have no baseline to compare actual production values to. The numbers can be compared to the standard, but the standard is just an educated guess made by a (no doubt talented) accountant, while the “planned cost” derived from the router is a ‘guess’ made by production.

Properly constructed routers have another benefit too: if the job cost variance report shows a very odd “planned cost” then that probably means that there is an issue with the router that may show up in scheduling the job. In this way production and accounting get actionable data out of the same single data set.

As well, this planned cost is critical in an actual costed environment. In CSI, if a partial job quantity is moved into inventory, how does CSI know how to cost it since not all the costs have been incurred on the job? The answer: it uses that planned cost. Thus it is critically important that the planned cost be as close to the actual as possible (think of it like a standard cost that isn’t actually standard). When the job is closed out, the difference between the planned and the actual is sent to the inventory adjustment account in CSI (see the arithmetic sketch after the list below). The correct sequence for inventory moves when using actual costing in CSI is then:

  • If the job is being moved into inventory complete, then close the job and move the inventory at the same time.
  • If there have been any partial moves from the job into inventory, then move all the parts as partial and close the job out separately (within the Job Orders form).
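
To make that concrete, here is a small arithmetic sketch of the idea. This is not CSI code and the figures are invented purely for illustration:

// Rough illustration of the planned-vs-actual idea above; all numbers are made up.
double plannedCostPerUnit = 10.00;  // "planned cost" derived from the router
double actualCostPerUnit  = 10.50;  // what the job actually ends up costing per unit
int    partialMoveQty     = 40;

// A partial move into inventory is valued at the planned cost,
// because not all of the job's actual costs are known yet.
double partialMoveValue = partialMoveQty * plannedCostPerUnit;                            // 400.00

// When the job is closed, the planned-vs-actual difference on those units
// is what lands in the inventory adjustment account.
double inventoryAdjustment = partialMoveQty * (actualCostPerUnit - plannedCostPerUnit);   //  20.00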

I found the operation and tuning of APS itself pretty straightforward with two notable exceptions:

  1. Forecasting does not work. Well, I take that back: there is a narrow set of circumstances where the Forecasting function might work, but I’ve been unable to find anyone that it would work for. The workaround for this is to put a bogus sales order into the system that serves as the forecast (and manually deduct from the forecast as it’s consumed).
  2. APS handles Setup times very poorly in that it doesn’t roll the Setup resource group off of the job after the setup is complete. Part of this probably has to do with the fact that CSI has no way to know if a setup is complete. The workaround for this is to put the setup on the router as a separate operation. This can lead to some wonky views in the scheduler since there is no guarantee that the machine being set up is the same one that will be doing the run, but the throughput will be correct.

We also ran into two other issues that were no fault of the package. In one instance, some of our routers were very inaccurate and it was very difficult to get them fixed, so the planning people resorted to marking the parts as MRP, which forces APS to use whatever lead time it is told. The other issue had to do with a change in methodology: it’s important that operations are closed out as they are completed, otherwise APS leaves the whole time scheduled. This was something we had to work on with the employees so that they knew when (and when not) to close out an operation.

Notes in CSI / Syteline

With our implementation of CSI/Syteline it was important to bring over our work instructions, which meant extracting the notes out of the old system and putting them into the new one. The issue here is that there is no real method to do this within the system, so direct SQL access is required to pull this off.

Please keep in mind that being a SQL DBA is one of my many part time jobs and not a specialty, so some of these steps may not have been as efficient as they could have been.

For our two different production systems the notes were stored in two different ways, though the process for putting them into CSI was basically the same:

  • For JDE our notes were integrated into the routing line and due to the structure they required some re-processing. Fortunately these routers were very stable so I didn’t have to worry about a “live migration” of the notes. Using a brief piece of code I extracted the lines and formatted them to Excel (although theoretically the same steps could be accomplished in Excel without using code).
  • For our JobBOSS notes I needed to be able to pull these over at or near go-live. This meant that I had to do some very light processing of the note text within Visual Studio.

In CSI there are three tables of importance:

  1. NoteHeaders – this table lists which “token” goes to which table. Since these are work instructions on the router, the notes will be associated with the jobroute table. The important step here is that at least one note needs to be created to generate this entry. There are other possible ways, but this is the simplest method: make a note on an operation line, and then check the NoteHeaders table to see what number goes to jobroute. The note can then be deleted.
  2. SpecificNotes – this table holds the note text (and note title) that will contain the work instructions.
  3. ObjectNotes – this table holds the note key associated with the note text in the SpecificNotes table, the table the note goes to (using the number from the NoteHeaders table), and the row pointer to the record in the table that it is associated with.

A complicating factor for our JobBOSS implementation is that we were required to rework and/or resequence the router lines (for instance, something that was “Op100-CNC” in JobBOSS might be “Op80-Mill” in CSI). If nothing had changed on the routers it would have been possible to “brute force” the note entry by adding all the notes, pulling the row pointers in sequence, and then building a query from that to add the associations in the ObjectNotes table (because if I had 100 router lines and 100 notes, I’d know that the 50th note in the SpecificNotes table goes to the 50th record in the jobroute table). However, due to the changing routers I had to handle the notes using the same C# code in Visual Studio that I was using to put the routers together, since only that code knew what the sequence would be.

The first step then was to extract the router lines from JobBOSS. I used a LINQ query like this:

var operations = from o in jo.Job_Operations
                 join o2 in jo.Jobs on o.Job equals o2.Job1
                 join o3 in jo.Materials on o2.Part_Number equals o3.Material1
                 where o.Status.Trim().Equals("T") && o2.Priority < 9 && o3.Status.Trim().Equals("Active") && o2.Part_Number != null
                 orderby o3.Material1, o.Sequence
                 select new { o.Job, o3.Material1, o.Sequence, o.Description, o.Est_Setup_Hrs, o.Inside_Oper, o.Operation_Service,
                              o.Run, o.Run_Method, o.Vendor, o.WC_Vendor, o.Work_Center, o.Last_Updated, o.Last_Updated_By, o.Est_Unit_Cost,
                              o.Note_Text, o.Cost_Unit };

The “where” clause limits the JobBOSS jobs being pulled over to templates (“T”) that go to Active part numbers.

Many, many lines of code later I get to the note section. Due to the nature of the notes tables I was unable to get the code to insert the notes directly into the database, and instead had to write the queries out to a text file that I then ran in SSMS. I cycled through each line in the router and wrote out an INSERT query for each note:

string sqlCmd = "INSERT INTO[SpecificNotes](SpecificNoteToken, NoteContent, NoteDesc, NoteExistsFlag, " +
"CreatedBy, UpdatedBy, CreateDate, RecordDate, RowPointer, InWorkflow)" +
"values((SELECT ISNULL(MAX(SpecificNoteToken) + 1, 0) from dbo.[SpecificNotes]),'" +
n.NoteText + "','WORK INSTRUCTIONS', 0, 'sa', 'sa', DEFAULT, DEFAULT, DEFAULT, 0);";

I wrote the lines of text out as they were generated and then copied the contents of the resulting file and pasted them into SSMS. Before running it I had to put the text “SET IDENTITY_INSERT SpecificNotes ON” at the beginning and “SET IDENTITY_INSERT SpecificNotes OFF” at the end.
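
For reference, one way to skip that manual step is to have the exporter emit the wrapper lines itself. This is only a sketch, assuming the same StreamWriter (sw) used elsewhere in the code and a hypothetical noteLines collection holding the extracted notes:

// Sketch only: the post above pastes the IDENTITY_INSERT lines in by hand;
// having the exporter write them produces a script that can be run as-is in SSMS.
sw.WriteLine("SET IDENTITY_INSERT SpecificNotes ON;");
foreach (var n in noteLines)   // noteLines = hypothetical list of extracted/escaped notes
{
    // build the INSERT exactly as shown above
    string sqlCmd = "INSERT INTO [SpecificNotes](...) values(...);";
    sw.WriteLine(sqlCmd);
}
sw.WriteLine("SET IDENTITY_INSERT SpecificNotes OFF;");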

With my notes added to the database I then rerun the code to generate the Current Operations lines. This time, a different piece of code in the routine goes out to find the note, now in the CSI database, that matches the note text held by the object for each Current Operation line:

//get notes from CSI
var snotes = from s in si.SpecificNotes
             select new { s.NoteContent, s.SpecificNoteToken };
List<sNotes> sNoteList = new List<sNotes>();
foreach (var s in snotes)
{
    sNoteList.Add(new sNotes());
    sNoteList.Last().Token = (int)s.SpecificNoteToken;
    sNoteList.Last().Note = s.NoteContent.Trim().Replace("'", "''");
}
//Marry the notes in the note DB to the process lines, OYE!
foreach (var n in syteCurrOps2)
{
    foreach (var s in sNoteList)
    {
        if (!s.Used)
        {
            //Note: essential that the notes read out of the database match the ones inserted (i.e. avoid whitespace issues!)
            if (n.NoteText.Trim().Equals(s.Note))
            { n.NoteNum = s.Token; s.Used = true; worker.ReportProgress(i++); break; }
        }
    }
}
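
The loop above assumes a small helper class to hold each note. The member names (Token, Note, Used) come from the snippet; the rest of this minimal sketch is my assumption about how it could be declared:

// Minimal helper assumed by the matching loop above.
class sNotes
{
    public int Token { get; set; }    // SpecificNoteToken from CSI
    public string Note { get; set; }  // normalized note text used for matching
    public bool Used { get; set; }    // set once the note has been matched to a router line
}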

Now that I know which note in the database goes to which router line, I need a line in the ObjectNotes table to match them up. Restating a key point here: a normal note must be manually added to a Current Operation line so that CSI will know which table the note is associated with. Just add a line and then query the NoteHeaders table to see what number must be used on the SQL INSERT lines for the ObjectNotes table.

In this case the code has the number 4 in the insert statement, but based on what the NoteHeaders table showed I would want to change it to a 1:

//NOTE: change #4# below to the number specified in the NoteHeaderToken field in the NoteHeaders table
//      This number is set when the "sample" note was created earlier (which is why a sample note must be created before insert)
string sqlCmd = "INSERT INTO [ObjectNotes](ObjectNoteToken,NoteHeaderToken,RefRowPointer,NoteType,SpecificNoteToken," +
"NoteExistsFlag,CreatedBy,UpdatedBy,CreateDate,RecordDate,RowPointer,InWorkflow)" +
"values((SELECT ISNULL(MAX(ObjectNoteToken) + 1, 0) from dbo.[ObjectNotes]),4,'" + n.RefRow + "',0," + n.NoteNum + ",0," +
"'sa','sa',DEFAULT,DEFAULT,DEFAULT,0);";
sw.WriteLine(sqlCmd);

Just as with the notes query, I write this query list out to a text file so that it can be run in SSMS (again adding the “SET IDENTITY_INSERT ObjectNotes ON/OFF” lines before/after the queries, since this time the inserts go into the ObjectNotes table).

The last step is that, since all these notes were entered via SQL and not the interface, CSI needs to know which specific Current Operation lines have notes. We do this by setting the NoteExistsFlag bit field to one. So again, I just dump out a series of lines to put into SSMS to update all of the lines, based on the row pointers of the lines that now have notes:

foreach (var n in syteCurrOps2)
{
    if (!n.RefRow.Equals("") && n.NoteNum > 0)
    {
        string sqlCmd = "UPDATE jobroute_mst SET NoteExistsFlag=1 WHERE RowPointer='" + n.RefRow + "';";
        sw.WriteLine(sqlCmd);
        worker.ReportProgress(i++);
    }
    else if (n.RefRow.Equals("") && n.NoteNum > 0)
        logger("Rowless line: " + "Item: " + n.Item + " Seq " + n.Seq);
}

So in the end there are three sets of SQL scripts exported: one for each of the two related notes tables, and one for the table that is having the notes added to it. The last line of code in that snippet logs lines that the code thinks should have a note but couldn’t find one. This typically happened due to odd characters (‘, ”, *, etc.) or whitespace mismatch issues, but overall it was quite reliable.

Slightly off topic, but you may see the “worker.ReportProgress” line in some of the code. Because some of the queries took a long time to run, I had to put my main methods into new threads (multi-threading). Every now and then in the code I put these ReportProgress lines in so that my (very) inexact progress meter would update on the interface to let me know it was running (and to give me some idea of where it was in the execution process). So under the code for my “export button” I had something like this to launch the Current Operation export process:

//extracts have to launch in a new thread so that the parent thread won't hang and progress can be tracked
worker.WorkerSupportsCancellation = true;
worker.WorkerReportsProgress = true;
if (comboBoxExpType.Text.Equals("CurrentOps"))
    worker.DoWork += new DoWorkEventHandler(exportCurrentOps);
worker.RunWorkerCompleted += new RunWorkerCompletedEventHandler(worker_RunWorkerCompleted);
worker.ProgressChanged += new ProgressChangedEventHandler(worker_ProgressChanged);
worker.RunWorkerAsync();
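
For completeness, the matching handlers look roughly like this. This is only a sketch; the control names (progressBarExport, labelStatus) are assumptions, and since ProgressChanged fires on the UI thread it is safe to touch the controls here:

// Sketch of the BackgroundWorker handlers wired up above; control names are assumed.
private void worker_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    // ProgressPercentage carries the running counter (i++) passed to ReportProgress,
    // so it's a rough "still alive" indicator rather than a true percentage.
    progressBarExport.Value = e.ProgressPercentage % 100;
    labelStatus.Text = "Processed " + e.ProgressPercentage + " lines...";
}

private void worker_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    labelStatus.Text = (e.Error == null) ? "Export complete" : "Export failed: " + e.Error.Message;
}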

ERP Evaluation

It was a bit of a slog, but we recently finished our implementation/conversion to Infor’s CloudSuite Industrial (aka CSI, aka Syteline). Our legacy products were JDE, Mas90 (aka Sage 100 ERP), and JobBOSS. Before I get into some of the nitty-gritty I wanted to use this post to go over our evaluation process.

First, we arrived at the decision to implement a new package for a variety of reasons:

  1. Our implementation of JDE was running on server software which would soon be completely unsupported, making it difficult for us to secure it and have the package still be usable. It was also prohibitively expensive to upgrade for a company of our size.
  2. We had grown through acquisition so now there were a couple different packages to manage.
  3. We’re an audited company, and the accounting functions within the JobBOSS package made an already tedious process more painful than it should be.
  4. In some ways we had outgrown JobBOSS. In particular, the scheduling functionality was quite inadequate for what we were trying to accomplish.

We started, of course, by evaluating several different packages. It’s interesting and disappointing that while these vendors can all spin up a demo environment easily enough, none of them will let you touch the package until you purchase it. In lieu of first-hand usage we were left working from a spreadsheet of our criteria with various ratings. If a package didn’t meet some criteria, like tracking machine time, it was a deal breaker, while other items, like the ability to track incentive pay, were preferred but not enough by themselves to rule out a package.

As an aside, while preparing for this process I went through various articles on the Internet, and some of them said that such checklists aren’t really a requirement since any ERP can be modified to do what you want it to do. This is obviously consultant talk and, more or less, a lie. For instance, some systems didn’t support actual costing, and what would be the fix for that? Rewriting nearly the entire ERP package?

Anyway, this is where the lack of a demo came back to bite some of the contenders, since we ended up favoring a package we at least had some first-hand knowledge of (having previously owned plants that used an ancient version of Syteline). All the ERP salesmen were great at running the “executive demo”, but anything remotely specific required a call with a consultant (while aggravating in some ways, at least the team selling Syteline was more prompt about doing this than anyone else). Lastly, Syteline was the only package that checked all of our “critical” items and most of our important ones.

For the implementation itself we relied on the business skills of the great people at KJC Consulting, but we did a lot of the menial technical work in house. Although we tried working with an earlier build of version 9 of CSI, the later 9.1 version that we went live on was much more user friendly in terms of deployment, and we had good luck loading all of our data via the grid views in CSI without having to involve outside parties (there were a few minor exceptions to this; notably, loading the item stockroom locations via SQL saved a bunch of time for the physical inventory data load).

Although loading the data is fairly straightforward, getting it out of the legacy systems in a format that CSI could digest was quite a bit more work. For this I relied heavily on LINQ and C# to extract and format the data out of JobBOSS (and the notes from JDE; JDE was close enough in formatting that we were able to use spreadsheets to bring over a lot of the data). I dreamed of cleaning up this code to make it more general purpose, but alas, it is a royal mess, full of workarounds, one-offs (huge pieces of the operation line export had to be dumped and rewritten to the needs of the specific plants), and of course known bugs.

Some of it might be helpful though, so over the coming weeks I am planning to put some code snippets up in case anyone finds themselves in the same spot that I was, just on the off chance that it might help.