EventStore on Azure and Ubuntu - it's a piece of cake! #1

EventStore is a well-known, open-source and solid database designed to be the very foundation of event-driven systems. What is great about it is the fact that it runs on both Windows and Ubuntu, which widens the technology stack it can be used with. If you prefer Linux solutions and would like to build an event-sourced system based on ES, there's nothing stopping you. In this short series of posts I will show how to quickly install, configure and manage EventStore using Ubuntu VMs in Azure.

Getting a VM

You can obtain an Ubuntu 14.04 VM from the marketplace in the Azure Portal. There's nothing special about its configuration or size - for the purpose of testing it can be whatever you like and are comfortable with. Once you fill in all the fields and provision the whole environment, you can connect to the machine and install the database.
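
If you prefer scripting over clicking through the portal, a roughly equivalent machine can be provisioned with the Azure CLI. The resource group, VM name, admin user and image reference below are just example values of mine, not anything required by EventStore:

/
# create a resource group and an Ubuntu 14.04 VM inside it
az group create --name eventstore-rg --location westeurope
az vm create \
  --resource-group eventstore-rg \
  --name eventstore-vm \
  --image Canonical:UbuntuServer:14.04.5-LTS:latest \
  --admin-username esadmin \
  --generate-ssh-keys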

Installation

To connect to the VM you need an SSH client and the credentials you provided during the VM creation process. I personally recommend using PuTTY in a Windows environment since it's lightweight and completely free. Once you're logged in, we can start installing an EventStore instance.
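
If you happen to connect from a Linux or macOS machine instead, a plain ssh client is enough; the user name and address below are placeholders for whatever you chose while creating the VM:

/
# replace the user name and DNS name/IP with your own VM's values
ssh esadmin@eventstore-vm.westeurope.cloudapp.azure.com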

First, run the following command:

/
curl -s https://packagecloud.io/install/repositories/EventStore/EventStore-OSS/script.deb.sh | sudo bash

Once you have the EventStore package repository configured, you can install it:

/
sudo apt-get install eventstore-oss=3.9.3

You can choose any version you like; in this particular post I chose 3.9.3 since it was the most recent one available at the time of writing.
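
If you're not sure which versions are published, apt can list them for you (assuming the eventstore-oss package name from the command above):

/
# list the EventStore versions available from the configured repository
apt-cache madison eventstore-oss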

Once EventStore is installed, we can start it using this command:

/
sudo service eventstore start

and use curl to send a test event to make sure everything is all right. To make things easier, take the following JSON from the documentation:

/
[
  {
    "eventId": "fbf4a1a1-b4a3-4dfe-a01f-ec52c34e16e4",
    "eventType": "event-type",
    "data": {
      "a": "1"
    }
  }
]

and use the following commands to send an event:

/
vi event.txt
curl -i -d @event.txt "http://127.0.0.1:2113/streams/newstream" -H "Content-Type:application/vnd.eventstore.events+json"

Note that we're using vi to quickly create the event.txt file with the JSON from above. When you execute the command, you should receive an HTTP 201 Created response:

/
HTTP/1.1 201 Created
Access-Control-Allow-Methods: POST, DELETE, GET, OPTIONS
Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-Forwarded-Host, X-PINGOTHER, Authorization, ES-LongPoll, ES-ExpectedVersion, ES-EventId, ES-EventType, ES-RequiresMaster, ES-HardDelete, ES-ResolveLinkTo
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: Location, ES-Position, ES-CurrentVersion
Location: http://127.0.0.1:2113/streams/newstream/0
Server: Mono-HTTPAPI/1.0
Date: Wed, 22 Feb 2017 08:09:26 GMT
Content-Type: text/plain
Content-Length: 0
Keep-Alive: timeout=15,max=100

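As a quick sanity check you can also read the event back over the same HTTP API - the Accept header below asks for the stream's Atom feed rendered as JSON:

/
curl -i -H "Accept: application/vnd.eventstore.atom+json" "http://127.0.0.1:2113/streams/newstream"
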
Note that the configuration file used by EventStore is located at /etc/eventstore/eventstore.conf, and since it's read-only for a regular user, you will have to use sudo to change anything in it. For now, leave it as it is.
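
When the time does come to tweak it, the workflow could look more or less like this (nano is just an example editor, any will do):

/
# inspect the current settings
cat /etc/eventstore/eventstore.conf
# edit with elevated privileges and restart the service so the changes take effect
sudo nano /etc/eventstore/eventstore.conf
sudo service eventstore restart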

What's next?

In the next posts I will show how to access EventStore from your local computer and what to change to be able to send and receive messages from it. We'll end this series by running a simple cluster of EventStore instances on three different Ubuntu machines.

Managing your git repository via REST API in VSTS

Let's say you'd like to store some results of a build or a release inside a repository. The reason could be anything - easy access, versioning, or it being the only tool you have access to. By default VSTS doesn't provide any git-related steps that would be helpful in such a case. Fortunately, once more its REST API comes to the rescue, giving us the ability to fully manage repositories, including pushing multiple commits.

Before we start, take a look at the overview to understand the general capabilities of this API.

Making an initial commit

Once you have your git repository created in VSTS, you can either initialize it with a commit from the GUI or push an initial commit using the API. Take a look at the example from the documentation:

/
POST /_apis/git/repositories/{repository}/pushes?api-version={version}

and its body:

/
{
  "refUpdates": [
    {
      "name": "refs/heads/master",
      "oldObjectId": "0000000000000000000000000000000000000000"
    }
  ],
  "commits": [
    {
      "comment": "Initial commit.",
      "changes": [
        {
          "changeType": "add",
          "item": {
            "path": "/readme.md"
          },
          "newContent": {
            "content": "My first file!",
            "contentType": "rawtext"
          }
        }
      ]
    }
  ]
}

you'll see that the properties making up this JSON are mostly self-descriptive. The only one that is not so obvious is oldObjectId. Basically it's the SHA-1 of the commit that this commit is based on - since there are no commits yet, it's simply a string of zeros.
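
Putting it together, a call to this endpoint could look roughly like the snippet below. The account name, repository name, PAT environment variable and file name are placeholder assumptions of mine, and {version} should be replaced with the api-version listed in the documentation:

/
# save the JSON body above as initial-commit.json, then push it;
# authenticate with a personal access token via basic auth (empty user name)
curl -X POST -u ":$VSTS_PAT" \
  -H "Content-Type: application/json" \
  -d @initial-commit.json \
  "https://myaccount.visualstudio.com/DefaultCollection/_apis/git/repositories/MyRepo/pushes?api-version={version}"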

Pushing data

Making an initial commit is a piece of cake. What if we'd like to update an existing file? The main issue here is finding the oldObjectId, which is required to make the request succeed. Once more the API comes in handy - what we can do is fetch a list of all pushes and take the last one. Take a look at the signature from the documentation:

/
GET https://{instance}/DefaultCollection/_apis/git/repositories/{repository}/pushes?api-version={version}[&fromDate={dateTime}&toDate={dateTime}&pusherId={guid}&$skip={integer}&$top={integer}]
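
For example, fetching just the most recent push together with its ref updates could look roughly like this (the account, repository and PAT variable are again made-up placeholders, and the single quotes keep $top from being expanded by the shell):

/
curl -u ":$VSTS_PAT" \
  'https://myaccount.visualstudio.com/DefaultCollection/_apis/git/repositories/MyRepo/pushes?api-version={version}&includeRefUpdates=true&$top=1'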

What is great about this request is the ability to filter the data - we don't have to download all pushes, only those from a given date interval, made by a specific pusher, or maybe just the top N. The response gives us a list of pushes ordered from the newest to the oldest. What is important here is to pass the includeRefUpdates=true parameter in the query string. This way we'll get the following additional property in the response:

/
{
  "repositoryId": "04baf35b-faec-4619-9e42-ce2d0ccafa4c",
  "name": "refs/heads/master",
  "oldObjectId": "0000000000000000000000000000000000000000",
  "newObjectId": "5e108508e2151f5513fffaf47f3377eb6e571b20"
}

and we're able to refer to the newObjectId property to make an update. Once we have it, we can once more use the endpoint we used for the initial commit, with a slightly modified body:

/
{
  "refUpdates": [
    {
      "name": "refs/heads/master",
      "oldObjectId": "5e108508e2151f5513fffaf47f3377eb6e571b20"
    }
  ],
  "commits": [
    {
      "comment": "Added a few more items to the task list.",
      "changes": [
        {
          "changeType": "edit",
          "item": {
            "path": "/readme.md"
          },
          "newContent": {
            "content": "Modified readme file!",
            "contentType": "rawtext"
          }
        }
      ]
    }
  ]
}

Once we post this request, a new commit should be pushed and visible when you access the repository.