Since there are already quite a few tutorials detailing how to build a Web API in ASP.NET Core, I thought I'd do something a little different. Instead of a full tutorial on developing an ASP.NET API, I want to go over a list of issues I encountered while developing a real product with this technology.

Let me give a bit of backstory on the project. My company was contracted to develop an educational mobile game. I was tasked with developing the API, the database, and a desktop tool for granting administrator access to the game's content. The project was completed using ASP.NET Core version 1.1.2, so please note that the issues I mention may be specific to that version and might be resolved in the future.

In-Memory Database ORM Problems

I encountered this issue very early in the project. It's fairly common for ASP.NET Core tutorials to suggest in-memory databases for testing and ease of use. It seems like a great tool, but the ORM layer written for the in-memory database is slightly less sophisticated than the one written for Microsoft SQL Server, which poses an issue. If we try to map child data nested within parent data to an in-memory database, the child data will not be mapped. Here's an example in JSON for better visibility:

{
  "Event_ID": 0,
  "name": "Cool Event",
  "county": "Orange",
  "Booths":[
    {
      "Booth_ID": 0,
      "name": "Medical Booth",
      "staff": [
        {
          "name": "John"
        }
      ]
    },
    {
      "Booth_ID": 0,
      "name": "Food Booth",
      "staff": [
        {
          "name": "Mary"
        }
      ]
    }
  ]
}
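
In code, that JSON corresponds to entities with nested navigation properties. The classes below are only a sketch to make the parent/child structure concrete; the property names mirror the JSON above rather than the real project's models:

using System.Collections.Generic;

public class Event
{
    public int Event_ID { get; set; }
    public string Name { get; set; }
    public string County { get; set; }
    public List<Booth> Booths { get; set; }
}

public class Booth
{
    public int Booth_ID { get; set; }
    public string Name { get; set; }
    public List<StaffMember> Staff { get; set; }
}

public class StaffMember
{
    public int StaffMember_ID { get; set; } // key added for EF's sake; not present in the JSON above
    public string Name { get; set; }
}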

If we try to map this data with the MSSQL ORM, all child objects (in this situation: booths within events, and staff within booths) will be properly stored in the database. However, if we try to do the same thing with the in-memory database ORM layer, only the topmost layer of data is mapped (in this situation: the event). To avoid this issue, you must use the MSSQL ORM system by installing MSSQL on your development machine and changing the DbContext configuration of your project from:

public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddDbContext<App_Context>(opt => opt.UseInMemoryDatabase());
    ...
}

to

public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddDbContext<App_Context>(opt => opt.UseSqlServer(connectionString));
    ...
}
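
In both snippets, connectionString is typically read from configuration rather than hard-coded. A minimal sketch, assuming a connection string named "DefaultConnection" in appsettings.json (that key name is my own example, not taken from the project):

// Inside Startup, with the configuration root exposed as the Configuration property
var connectionString = Configuration.GetConnectionString("DefaultConnection");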

Unfortunately, this will leave you with a persistent database, which can cause friction when you update the model's properties, something that happens often early in a project's development.
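
One way to cope with that friction during early development is to drop and recreate the schema on startup. This is only a sketch of that idea, not something from the project, and it wipes all data on every run, so it's only sensible while the model is still in flux:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, App_Context context)
{
    if (env.IsDevelopment())
    {
        // Rebuild the database from the current model; every run starts clean
        context.Database.EnsureDeleted();
        context.Database.EnsureCreated();
    }

    app.UseMvc();
}

The longer-term answer is EF Core migrations, which evolve the schema without throwing the data away.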

There is an alternative to using a persistent database that will still store child data: supplement the missing mapping behaviour ourselves, as sketched below. It really doesn't make sense to do this, though, if we're going to end up deploying on MSSQL anyway.
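
Roughly, using the example entity classes from earlier, that supplementary mapping would mean walking the object graph and adding each child to the context ourselves (again, a sketch of the idea rather than code from the project):

public void AddEventWithChildren(App_Context context, Event newEvent)
{
    // The in-memory provider stores the parent...
    context.Add(newEvent);

    // ...so add every child explicitly to make sure nothing in the graph is skipped
    foreach (var booth in newEvent.Booths)
    {
        context.Add(booth);
        foreach (var member in booth.Staff)
        {
            context.Add(member);
        }
    }

    context.SaveChanges();
}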

Extended Controller Endpoints

While working on the controllers for the API, I wanted to add specialized endpoints to make the front-end developers' lives easier. Some of these special endpoints conflict with existing endpoints within a given controller. Consider the following use case, and keep in mind that in this example each "Booth" is the parent in a one-to-zero-or-one relationship with its "Prize":

//Route: GET api/Booths/{booth id}
[HttpGet("{id:int}")]
public IActionResult Get(int id)
{
    ...
    var booth = dbBoothRepository.Find(id);
    return new ObjectResult(booth);
}

This action handles GET requests that take in an id and return the corresponding object (a booth) for this controller. But say you want an additional action that returns the Prize at a given Booth using a Booth ID, at the endpoint: api/Booths/PrizeAt/{booth id}.

//Route: GET api/Booths/PrizeAt/{booth id}
[HttpGet("{id:int}")]
public Prize Get(int id)
{
    ...
    return dbPrizeRepository.PrizeAtBooth(id);
}

Having this endpoint makes working with the API a lot easier. However, you cannot have two actions that handle the same type of request and take the same arguments within a single controller; it creates ambiguity when ASP.NET Core tries to route requests. You may attempt to fix this by adding the Route data annotation to your actions, which would look something like this:

[HttpGet("{id:int}")]
[Route("api/[controller]")]
public IActionResult Get(int id)
{
    ...
    var booth = dbBoothRepository.Find(id);
    return new ObjectResult(booth);
}
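
// Same HTTP verb and an identical C# signature as the action above, so this will not even compile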
[HttpGet("{id:int}")]
[Route("api/[controller]/PrizeAt")]
public Prize Get(int id)
{
    ...
    return dbPrizeRepository.PrizeAtBooth(id);
}

This should fix the ambiguity when routing is attempted, but in the end we have two methods with the same signature in the same class, which is a compile error. It would be much neater if data annotations were able to change a method signature, but they are not.

So on to the solution. You must create another controller and modify its route value to contain the route of the controller whose endpoint you are extending, then add the corresponding action. I know that sounded complicated, but it's a very simple solution:

[Route("api/Booths/[controller]")]
public class PrizeAtController : Controller
{
    ...
    [HttpGet("{id:int}")]
    public Prize Get(int id)
    {
        return dbPrizeRepository.PrizeAtBooth(id);
    }
}

This will now function as we want it to: it responds to a GET request with a single id parameter at the route api/Booths/PrizeAt/{booth id} by returning the Prize that corresponds to the Booth id provided.
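
As a quick sanity check, the new route can be exercised with any HTTP client. A minimal sketch using HttpClient, where the host, port, and booth id are all placeholder values:

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class PrizeAtSmokeTest
{
    public static async Task RunAsync()
    {
        using (var client = new HttpClient())
        {
            // Placeholder URL; point this at wherever the API is actually hosted
            var response = await client.GetAsync("http://localhost:5000/api/Booths/PrizeAt/5");
            Console.WriteLine((int)response.StatusCode);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}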

Deploying While Keeping Costs Low

Deployment problems sting so badly because the project works fine in your development environment, but unforeseen problems won't let it function in production. And if it doesn't ship, it's worthless.

Due to budgetary restrictions, we offered our client hosting on DigitalOcean's cheapest server instance. Members of our team (myself included) had experience with DO and didn't foresee any issues.

That was a big mistake. When I attempted to install MSSQL on the server instance, it wouldn't let me complete the configuration. It turns out that MSSQL requires 2GB of RAM to install, and DO's cheapest instance only has 512MB. Upgrading to a 2GB instance would have cost our client 4x as much each month in hosting costs. We try to deliver our products at the lowest cost possible, so we looked for solutions.

I came to the conclusion that we'd have to use a different DBMS, and chose MySQL for its great documentation. It installed on the server instance just fine. But solving programming problems is like killing a hydra: once you solve one, two more pop up.

I needed to switch to a MySQL ORM layer so Entity Framework could interact with the database, but third-party database providers are developed by the third parties themselves. I guess Oracle didn't see this as a priority, because they still hadn't released a package with Entity Framework Core support. Luckily for me, the Pomelo Foundation had developed a provider that works with MySQL. However, this came with some drawbacks.
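
For reference, the switch itself is a one-line change once the Pomelo.EntityFrameworkCore.MySql package is installed, mirroring the earlier snippets (same App_Context and connectionString as before):

public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddDbContext<App_Context>(opt => opt.UseMySql(connectionString));
    ...
}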

The most glaring issue was the lack of primary key auto-increment. This worked fine on MSSQL, but wasn't functioning with Pomelo's MySQL package. I'm pretty sure it had something to do with the package not reading data annotations properly. Our inelegant solution was to simply generate the primary key as a random number between 0 and the max integer value (2^31 - 1). I even added a check to see if the randomly generated id already existed, even though the odds against a collision are well over a million to one.

public static class Identity_Manager
{
    private static readonly Random random = new Random();

    // Random surrogate key in the range [0, int.MaxValue - 2]
    public static int getId()
    {
        return random.Next(int.MaxValue - 1);
    }
}

// Regenerate until the key isn't already in use
var key = Identity_Manager.getId();
while (repository.Find(key) != null)
{
    key = Identity_Manager.getId();
}

Though crude, the solution fixed our issue without any lasting detriments.

That's it for all the issues I ran into while working on the API. I hope this was helpful.

-Mr