I was working at Builder’s Square back in the mid-80s and got tasked with converting the company’s existing payroll process. We used large 11 x 17 forms for the employees to fill out, which were then sent to ADP. These forms held information such as address, bank routing numbers for direct deposit, tax withholding, etc. The stores also sent their employees’ weekly hours out to ADP so they could cut checks.
Builder’s Square was a subsidiary of K-Mart, and the word had come down that K-Mart’s payroll department could do the work. They had already created a custom record layout for us to submit all the data from those large forms up to K-Mart in Michigan.
There was a developer up at K-Mart who took the uploaded data and imported it into their payroll system, and a payroll manager up at K-Mart who handled the project management and coordination. A couple of trainers at Builder’s Square went out to the stores and showed the payroll clerks what needed to be done.
I had to work with the developer at K-Mart to get everything on the ADP form into their record format. They had also assumed we’d always send every field on every record, but I modified the format to fill any field not actually being updated with nulls or low-values. No sense causing an update for a field that hasn’t changed.
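The low-values idea can be sketched in a few lines. This is a hypothetical illustration, not the original layout: the field names, widths, and Python itself are my inventions here (the real system would have used fixed-width mainframe records).

```python
# Sketch of a fixed-width change record where any field NOT being updated
# is filled with low-values (0x00 bytes) so the receiving system leaves it
# alone. The layout below is invented for illustration.

LOW_VALUE = b"\x00"  # "no update" filler byte

# (field name, width) pairs for a made-up employee change record
LAYOUT = [("emp_id", 9), ("address", 30), ("routing_no", 9), ("withholding", 4)]

def build_change_record(changes: dict) -> bytes:
    """Emit one record; only fields present in `changes` carry data."""
    parts = []
    for name, width in LAYOUT:
        if name in changes:
            # pad/truncate real data to the fixed field width
            parts.append(str(changes[name]).ljust(width)[:width].encode("ascii"))
        else:
            parts.append(LOW_VALUE * width)  # unchanged: send low-values
    return b"".join(parts)

# Only the employee id and withholding changed; the other fields stay low.
rec = build_change_record({"emp_id": "123456789", "withholding": "0002"})
```

On the receiving side, the importer would simply skip any field that arrives as all low-values.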
I created data entry programs to update the employee data and let the store payroll clerks enter the hours. Any field that was changed got logged to a transaction file showing when it was changed and who changed it. That file was extracted at the end of the week to create the change records for K-Mart.
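The change-logging idea can be sketched like this. A minimal illustration only: the field names and the in-memory list standing in for the transaction file are my assumptions, not the original programs.

```python
# Sketch: every field update is logged with who changed it and when, so the
# weekly extract can turn the log into change records. Illustrative only.
from datetime import datetime, timezone

transaction_log = []  # stand-in for the transaction file

def update_field(employee: dict, field: str, new_value, changed_by: str):
    old_value = employee.get(field)
    if old_value == new_value:
        return  # nothing changed, so nothing to log or transmit
    employee[field] = new_value
    transaction_log.append({
        "emp_id": employee["emp_id"],
        "field": field,
        "old": old_value,
        "new": new_value,
        "by": changed_by,
        "at": datetime.now(timezone.utc).isoformat(),
    })

emp = {"emp_id": "123456789", "address": "1 Main St"}
update_field(emp, "address", "2 Oak Ave", "clerk42")
update_field(emp, "address", "2 Oak Ave", "clerk42")  # no-op, not logged
```

Skipping the no-op update is what keeps unchanged fields out of the weekly transmission.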
Working with our Avery forms vendor rep, the two of us shrank that large payroll form down to 8.5 x 11. It was still used, but instead of going to ADP the forms now went to Builder’s Square corporate payroll. Reworking the paper form also let me lay out the green-screen data entry programs to match it.
We used a dedicated modem to transmit the changes and payroll hours to K-Mart each week. The transmission took about four hours on a Sunday, which was acceptable. I also had to code the transmission programs to handshake over SNA with the K-Mart mainframe.
The next day K-Mart would send down the current employee data file, which we would continue to maintain at Builder’s Square. This kept the two systems in synchronization: K-Mart had the master copy of the employee data while Builder’s Square had a local copy. If a problem occurred and we missed a change, it would be obvious and we’d have to key the changes in again.
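That reconciliation step can be sketched as a simple diff of the two copies. The record shapes here are invented; the real files would have been fixed-width employee master records.

```python
# Sketch: compare the master file sent down from K-Mart against the local
# copy; any mismatch reveals a change that was missed and must be re-keyed.

def find_missed_changes(master: dict, local: dict) -> dict:
    """Return {emp_id: (master_rec, local_rec)} for records that differ."""
    diffs = {}
    for emp_id, m_rec in master.items():
        l_rec = local.get(emp_id)
        if l_rec != m_rec:
            diffs[emp_id] = (m_rec, l_rec)
    return diffs

master = {"1": {"addr": "2 Oak Ave"}, "2": {"addr": "9 Elm St"}}
local  = {"1": {"addr": "2 Oak Ave"}, "2": {"addr": "5 Pine Rd"}}
missed = find_missed_changes(master, local)  # employee "2" is out of sync
```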
So we got everything working and folks were getting paid correctly. Occasionally a store’s connection would go down, so I had a workaround: the store payroll clerk could drive to a nearby store. I had used the device description the clerk signed in on to limit them to just their own store’s employees, so when a clerk went to another store I’d do a temporary override letting them maintain one store’s employees from the other. It wasn’t very pretty, but it worked. With a bit more time I’d have given that override ability to the corporate payroll department and gotten myself out of the loop.
At the end of the project Mike Smith, the VP of IT at Builder’s Square, commented to me that ADP did the payroll cheaper than K-Mart. I got the impression maybe he didn’t want us to succeed with the project. I replied that it was in-house and probably cheaper for K-Mart. He was looking at it from Builder’s Square’s perspective; I looked at it more from K-Mart’s point of view.
Was Builder’s Square payroll a subsystem that needed to be optimized for K-Mart, or a system that should have been left alone, since the conversion to K-Mart wasn’t a cost improvement?
In Web API Design, page 13 says to “Never release an API without a version and make the version mandatory.” I don’t agree with this statement.
If I’m a developer I want the freedom to include or exclude a version from the URL. Stick the version on the end and make it optional. If I want to always get the latest and greatest from the API then I can leave off the version. Maybe I’m just doing testing and always want the latest. It’s possible the API may break but that should be up to me to decide. Don’t force me to include the version by making it mandatory.
If someone is doing continuous deployment there could be many small changes to the API. Usually the code supports both the old and the new version of the API. Even if I’m forced to include the version, the old version can still break. Putting the version in the URL reduces the chances of breaking clients, but it’s not completely avoidable.
A version number is really just an alias for a date/time stamp. If I’m doing continuous deployment I won’t declare a version until enough changes have been released into the API to warrant it. If you allow a date as well as a version in an API, a client can pin to some intermediate slipstream version. So, using their account example, a versioned URL could be:
- /account (always the current version)
- /account/20140524015300 (an ISO 8601/RFC 3339 timestamp with the separators, T, and Z stripped out)
- /account/20140524 (simple ISO date)
- /account/05242014 (US date format, MMDDYYYY)
- /account/V1 (with V1 as an alias for something like 2014-05-24T00:00:00Z)
- /account?v=1 (the Facebook style, with the version as a query parameter, is OK too)
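One way the list above could be implemented is to normalize every version spec to a timestamp, with a missing spec meaning "latest." A sketch in Python; the V1 alias table and the format-matching order are my assumptions:

```python
# Sketch: resolve any of the URL version forms to a datetime. None means
# "serve the current version." The alias table is hypothetical.
from datetime import datetime
from typing import Optional

VERSION_ALIASES = {"v1": datetime(2014, 5, 24)}  # V1 -> 2014-05-24T00:00:00Z

def resolve_version(spec: Optional[str]) -> Optional[datetime]:
    """Map a version spec from the URL to a datetime; None means latest."""
    if spec is None:
        return None
    if spec.lower() in VERSION_ALIASES:
        return VERSION_ALIASES[spec.lower()]
    for fmt in ("%Y%m%d%H%M%S",  # 20140524015300
                "%Y%m%d",        # 20140524
                "%m%d%Y"):       # 05242014 (US MMDDYYYY)
        try:
            return datetime.strptime(spec, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized version spec: {spec!r}")
```

With this shape, a declared version really is just an alias for a timestamp, and date-form URLs get you the slipstream versions in between.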
It’s also possible to include an Accept-Datetime header in the request (why isn’t that date in ISO or RFC 3339 format?). If multiple versions are present, I’d say the precedence is URL path, then query parameter, then header.
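That precedence rule could look like this in practice. A sketch only, assuming the resource is the first path segment and any trailing segment is a version spec:

```python
# Sketch: if a version appears in more than one place, the URL path wins,
# then the ?v= query parameter, then the Accept-Datetime header.
from urllib.parse import parse_qs, urlparse

def version_from_request(url: str, headers: dict):
    """Apply URL > query parameter > header precedence; None means latest."""
    parsed = urlparse(url)
    segments = [s for s in parsed.path.split("/") if s]
    path_v = segments[1] if len(segments) > 1 else None  # /resource/<spec>
    query_v = parse_qs(parsed.query).get("v", [None])[0]
    header_v = headers.get("Accept-Datetime")
    return path_v or query_v or header_v
```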
You could also include a version in the Accept content type. But a media type can be versioned independently of the resource, so I’d be cautious about putting a resource version in a MIME type.
I’d accept any and all possible versions to make the API more robust.
When I worked at the Price Club we started looking for a replacement merchandising system. Oddly enough, the project was called NMS, for New Merchandising System.
One of our requirements was Average Weekly Sales. We looked at a software package used by Boscov’s, a regional department store in Pennsylvania, and it appeared to fit the bill. We asked if they had Average Weekly Sales, as a check-off on our requirements, and the vendor replied yes. Awesome, check that one off. Other questions got good answers, so we thought we had a good fit.
When we got the software installed we soon found out that we really required Average Weekly Sales by Item by Warehouse/Store. Boscov’s, being a regional retailer and not too spread out, only had Average Weekly Sales by Item. Not by location? Whoops! We hadn’t gone deep enough into their data model to discover that issue. We may also have assumed that everyone had Average Weekly Sales by location, so we never thought to ask.
The idiom “the devil is in the details” comes to mind.
So my lessons learned from this were:
- Look at the underlying data model of a software package when matching up your requirements. They had Average Weekly Sales, but it wasn’t an attribute on the right entity in the data model.
- Be careful about any assumptions (even hidden ones) you might be making.