Over-Engineering Happens

Aleksandr Guidrevitch
Published in darwinapps
Sep 8, 2017 · 4 min read


Last year, we had a client with a nearly 20-year-old online store written in Visual Basic who wanted to upgrade to a modern, beautiful, and, of course, mobile-friendly version.

Initially, this project looked like a piece of cake for us, as we’ve delivered on these kinds of projects time and time again. But once we investigated their setup, all the small details started to pop up. It turned out that they were using Atrex Inventory Tracking / POS software by Millennium Software, a Windows desktop application, to track their inventory. The inventory itself was exported to an MS Access database file (MDB) once a day and uploaded to the FTP server where the store was running, and the store’s Visual Basic scripts would work directly with this uploaded MDB file.

Before signing a contract, we needed a proof of concept from our developers that we would be able to import products into the new store running in a Linux environment.
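For the curious, here is a minimal sketch of one way such a proof of concept could look on Linux. This is not our actual importer: it assumes the mdbtools command-line utilities (specifically mdb-export, which dumps a table as CSV) are installed, and the file path, table name, and column names are made up for illustration.

<?php
// Proof-of-concept sketch (not the production importer): read an Atrex MDB
// export on Linux by shelling out to mdbtools' mdb-export, which prints a
// table as CSV with a header row. The path, table name, and column names
// ("products", "sku", "name") are assumptions for illustration only.

$mdbFile = '/tmp/atrex/inventory.mdb';
$table   = 'products';

$cmd = sprintf('mdb-export %s %s', escapeshellarg($mdbFile), escapeshellarg($table));
exec($cmd, $lines, $status);
if ($status !== 0) {
    die("mdb-export failed with status $status\n");
}

// First CSV line is the header row.
$header = str_getcsv(array_shift($lines));
foreach ($lines as $line) {
    $fields = str_getcsv($line);
    if (count($fields) !== count($header)) {
        continue; // skip malformed rows in this sketch
    }
    $row = array_combine($header, $fields);
    // A real import would map $row onto the new store's product schema here.
    printf("Would import SKU %s (%s)\n", $row['sku'] ?? '?', $row['name'] ?? '?');
}

Anything along these lines is enough to demonstrate that the product data is reachable from a Linux environment.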

Reverse Engineering

It turned out that Atrex has an E-Commerce tool that can export data into Zen-Cart by uploading it over FTP and then running some import scripts crafted by the Atrex team. We didn’t want to build the store on Zen-Cart, but this was the only tool we had.

With the help of reverse engineering and the power of FTP logs, we were able to collect, decrypt, and decipher the import and export scripts the tool uploaded, along with the formats of the incoming and outgoing data.

The tool uploaded PHP files and data into the /atrex/ FTP directory, and then triggered https://customer.com/atrex/import.php through the web. That created two problems for us:

  1. The FTP root was the site’s web root, so every file of the site was exposed to FTP users. That is not a good idea, given that an untrained FTP user can break the site in the blink of an eye.
  2. As we at DarwinApps were not running Zen-Cart, we needed our own custom /atrex/import.php script, but the tool would overwrite it on every run. And if we restricted write permissions on /atrex/import.php, the tool would fail because it wouldn’t be able to upload its required files.

The idea popped up almost instantly: it seemed we needed a custom FTP server.

Custom FTP Server

The server was supposed to address these two issues in the following way:

  1. Expose only selected directories to users (we needed only /atrex/ in the beginning), and
  2. Report a successful upload of import.php to the Atrex tool (but not actually overwrite the file).

Additionally, it was supposed to expose the images and templates directories so the client’s staff could do their job, while keeping the staff away from the site’s source code. Another extra function of the custom FTP server was to create backups of the incoming data files whenever they were deleted or overwritten.

We spent around 24 hours building our own FTP server on top of Perl and its Net::FTPServer package. It worked most of the time, but sometimes it failed for no obvious reason.

Unfortunately, we found nothing useful in the logs to determine why these failures were occurring. After two unsuccessful attempts and around eight hours of investigation, we decided it would be wiser to stop wasting our engineer’s time and find another solution. We ended up spinning up a proftpd server with the following setup:

  1. proftpd points to a location outside of the web root directory, so nothing uploaded over FTP is ever accessible through the web,
  2. Our /atrex/import.php is safe from overwriting, as it lives outside of the FTP root, and
  3. Backup functionality is implemented in /atrex/import.php itself (a quick sketch follows below).
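For completeness, here is a minimal sketch of the backup-before-import idea that ended up inside /atrex/import.php. It is not the production code; the paths and file names are made-up assumptions.

<?php
// Sketch of backup-before-import inside /atrex/import.php.
// Paths and file names below are illustrative assumptions.

$incoming  = '/srv/ftp-drop/atrex/inventory.mdb'; // file uploaded by the Atrex tool
$backupDir = '/srv/ftp-drop/atrex/backups';

if (!is_file($incoming)) {
    http_response_code(404);
    exit("Nothing to import\n");
}

// Keep a timestamped copy before the import can overwrite or delete anything.
if (!is_dir($backupDir) && !mkdir($backupDir, 0750, true)) {
    http_response_code(500);
    exit("Cannot create backup directory\n");
}

$backup = sprintf('%s/inventory-%s.mdb', $backupDir, date('Ymd-His'));
if (!copy($incoming, $backup)) {
    http_response_code(500);
    exit("Backup failed, aborting import\n");
}

// ... the actual product import runs here, reading $incoming ...
echo "OK\n";

About an hour of work, versus a custom FTP server.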

Why Did This Happen?

It’s obvious that we should have spun up proftpd in the very beginning, but for some reason, we didn’t. Looking back, I would say we:

  1. Were eager to demonstrate we’re capable of building virtually anything, and
  2. Didn’t spend enough time looking for a simpler solution.

As a result, we over-engineered this part of the project.

How much did it cost in the end?

For the customer, it cost nothing. We had a fixed-price contract, and this issue was addressed within the free post-production support plan.

For DarwinApps, it cost around 24 hours of a senior software engineer’s work for the initial FTP server implementation, around eight hours of investigating the problems, one hour to implement the backup-before-import functionality right in import.php, and two hours of a DevOps engineer’s time to replace our custom FTP server with proftpd and reconfigure the store application.

Quite the high price for this lesson! But in the end, we’re all happy that we learned and fixed something on our side in the process.
