mdlink: Linking multiple node modules with a single command

Introduction

When working on Node.js applications, developers usually pull in additional libraries and frameworks to speed up development, and more often than not the number of libraries they rely upon is quite high.

It is also not uncommon to need to modify one of those libraries to fix a bug, or to fork it to add functionality required by the project it is used in.

The usual way to do this in a Node.js project is the following (a shell sketch follows the list):

  1. Clone the library's repository onto your local machine.
  2. cd into the library's folder.
  3. Run sudo npm link to link the module globally.
  4. cd into your project's folder.
  5. Run npm link <library_name> to link to the library.
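
For a hypothetical library called some-library (the name and paths are just placeholders), the whole manual dance looks roughly like this:

$ git clone https://github.com/someone/some-library
$ cd some-library
$ sudo npm link                # registers the clone globally
$ cd ~/projects/my-app
$ npm link some-library        # points node_modules/some-library at the clone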

If you need to link multiple libraries at the same time, or a library depends on another library that you also need to link, doing it one by one quickly becomes cumbersome.

In my free time I developed mdlink, a small utility that makes it easy to npm link multiple node modules in a given project.

It was born out of my annoyance while developing Ember.js applications with multiple addons at the same time: for each addon I would need to clone it and link it. This tool provides an easy way to npm link modules in bulk and to remove those links later when they are no longer necessary.

Modules are specified in the mdlink.config.json configuration file, e.g.:

{
  "base_modules_path": "~/gits/test",
  "modules": {
    "mdlink": {
      "url": "https://github.com/fr0gs/mdlink",
      "path": "~/gits/test/mdlink"
    }
  }
}

Each module to be linked is declared in the modules object. It can have both url and path, or only one of the two.

The base_modules_path is used when a module has a url but no path: in that case the module is cloned into base_modules_path/module_name instead.
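
For instance, a hypothetical entry like the following, with only a url, would be cloned into ~/gits/test/my-module:

{
  "base_modules_path": "~/gits/test",
  "modules": {
    "my-module": {
      "url": "https://github.com/someone/my-module"
    }
  }
}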

Let's see an example with a new Ember application. First, install mdlink globally with npm and create the application:

test-ember:master* λ sudo npm install -g mdlink
test-ember:master* λ ember new test-ember

Then, create a new mdlink.config.json file inside the test-ember app.

test-ember:master* λ mdlink init

And let's say that in our new application we want to locally modify both ember-moment and ember-promise-helpers. We modify the mdlink.config.json file like this:

{
  "base_modules_path": "/home/esteban/gits/test",
  "modules": {
    "ember-moment": {
      "url": "https://github.com/stefanpenner/ember-moment",
      "path": "/home/esteban/gits/test/ember-moment"
    },
    "ember-promise-helpers": {
      "url": "https://github.com/fivetanley/ember-promise-helpers.git",
      "path": "/home/esteban/gits/test/ember-promise-helpers"
    }
  }
}

Do the linking:

test-ember:master* λ mdlink s
[+] Path /home/esteban/gits/test-ember/node_modules/ember-moment already exists, removing it.
[+] <path exists> . Successfully cloned https://github.com/stefanpenner/ember-moment in path: /home/esteban/gits/test/ember-moment
[+] Create link from /usr/lib/node_modules/ember-moment -> /home/esteban/gits/test/ember-moment
-------------------------
[+] <path exists> . Successfully cloned https://github.com/fivetanley/ember-promise-helpers.git in path: /home/esteban/gits/test/ember-promise-helpers
[+] Create link from /usr/lib/node_modules/ember-promise-helpers -> /home/esteban/gits/test/ember-promise-helpers

Verify that both modules were properly cloned:

test λ ls -la ~/gits/test/
drwxrwxr-x  4 esteban esteban 4,0K Dez 19 22:07 ./
drwxrwxr-x  8 esteban esteban 4,0K Dez 19 22:05 ../
drwxrwxr-x 10 esteban esteban 4,0K Dez 19 22:36 ember-moment/
drwxrwxr-x  9 esteban esteban 4,0K Dez 19 22:07 ember-promise-helpers/

Modify index.js in both modules, and run ember s from the test-ember app folder:

test-ember:master* λ ember s
Linked ember-moment
Linked ember-promise-helpers
Could not start watchman
Visit https://ember-cli.com/user-guide/#watchman for more info.
Livereload server on http://localhost:7020

Build successful (14117ms) – Serving on http://localhost:4200/

Have fun!

Ember.js cookies over SSL

Problem

I am currently working on an issue in a very old Ember.js application (ember-cli version 1.13.0). After performing the usual ritual (npm install; bower install; ember s --proxy <proxy>) I was presented with the application's login screen, but to my surprise, after entering a valid set of credentials the next HTTP call invariably failed.

Client -> POST /sessions -> Server API (works)
Client -> GET /settings -> Server API (fails)

Now, the frontend code had not changed, so chances were that the backend was the culprit. Comparing against a working production environment, two things stood out:

  • In the working environment both frontend and backend live on the same server, while when developing with ember s I have a server running locally and make calls to the external API, so this might be a problem related to making cross-origin requests.
  • After a successful authentication, the server sends a session cookie using the Set-Cookie header, and in subsequent calls the browser sends that cookie back in the request headers. Usual behavior. But in the failing environment, after a successful login, the browser received the cookie yet never sent it back.

I quickly discarded the first option, since when you use ember s --proxy you are really making calls to the same origin and the proxy then routes them to the external API. But the other one... why was my cookie not being sent back?! It was not until a coworker pointed out the Secure flag in the session cookie that things clicked. What was that?

It turns out, as explained here, that the Secure flag exists to prevent third parties from observing sensitive cookies sent in clear text; such cookies are only sent back by the browser over HTTPS, and the development server was running over plain HTTP.
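
In other words, the session cookie in the login response carried something along these lines (the values here are made up):

Set-Cookie: session_id=abc123; Path=/; HttpOnly; Secure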

Solution

The solution was quite simple! The ember-cli provides an easy way to serve traffic over SSL using your own certificate. For development purposes, I just created a self-signed certificate following the instructions in the Heroku Dev Center. After that you end up with a server.key private key and a server.crt certificate.
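
The linked instructions walk through it step by step; the gist is a single openssl invocation along these lines (file names and validity period here are illustrative):

$ openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key -out server.crt -days 365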

Then you can run the ember-cli specifying the ssl options in .ember-cli:

{
  "disableAnalytics": false,
  "ssl": true,
  "ssl-key": "path/to/server.key",
  "ssl-cert": "path/to/server.crt"
}

By default ember-cli will look in the ssl/ folder for the server.key and server.crt files, so placing those files there and running

$ ember s --ssl=true --proxy <proxy>

will serve the application over HTTPS.

Have fun!

Array of extended objects in Python using list comprehensions and lambda functions

Problem

It's been a while since I last wrote a post, so I thought that, even though the idea might seem a bit silly at first, writing about a small problem I encountered the other day would help me kick off the habit again.

I was developing a very simple microservice that would receive a GET request with two parameters, issue a SPARQL query to a Virtuoso store and then transform the returned array of objects by extending each one with the same additional meta information. Say:

res = [{ 'title': 'Oh boy' }, { 'title': 'Oh girl'}]

And then add some additional metadata like { 'meta': { 'author': 'Myself'}}

Ending up with

res = [ {
        'title': 'Oh boy',
        'meta':  {
          'author': 'Myself'
          }
        },
        {
          'title': 'Oh girl',
          'meta': {
            'author': 'Myself'
          }
        }]

Solution

I wanted something self-contained and as functional as possible, using list comprehensions for example. Unfortunately, there is no built-in method in Python that updates a dictionary and returns the updated dictionary. The regular way looks like this:

a = { 'b': 3 }
a.update({'c': 5}) # Dict updated, does not return anything
print(a) # {'c': 5, 'b': 3}

Ultimately I came up with a small solution:

result = [(lambda x, y=z.copy(): (y.update(x), y))({ 'meta': { 'author': 'Myself' } })[1] for z in res]

Tada! Combining list comprehensions, lambda functions and the built-in dictionary copy() method, we can build a new list with a copy of each object already extended.

The lambda takes two parameters: x, the extension object passed in as an argument, and y, which defaults to a copy of the current element of the list (made with copy(), assuming each element is a dictionary). The body builds the tuple (y.update(x), y): update() mutates the copy in place and returns None, so the extended dictionary ends up as the second element of the tuple, which is what the trailing [1] picks out.

We could even bake this into a function:

def map_extend(array=[], ext={}):
  return [(lambda x, y=z.copy(): (y.update(x), y))(ext)[1] for z in array]

>>> res
[{'title': 'Oh boy'}, {'title': 'Oh girl'}]
>>> ext = { 'meta': {'author': 'Hola'}}                                                     
>>> map_extend(res, ext)
[{'meta': {'author': 'Hola'}, 'title': 'Oh boy'}, {'meta': {'author': 'Hola'}, 'title': 'Oh girl'}]
>>> map_extend(res, {})                                                                     
[{'title': 'Oh boy'}, {'title': 'Oh girl'}]
>>> map_extend([], {})                                                                      
[]
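
As a side note, on Python 3.5 or newer the same result can be had with dictionary unpacking instead of the lambda trick; a minimal sketch, not part of the original solution:

def map_extend(array=[], ext={}):
  return [{**z, **ext} for z in array]  # builds a new dict per element, shallow-merging ext into it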

Have fun!

Clojure threading macros in ES6

While working on the functionality for getting a docker-compose path from the cursor position, I realized at some point that I was constantly writing code like this:

let first_result = func_call1(val);
let second_result = func_call2(first_result);
let third_result = func_call3(second_result);

...etc...

This is not necessarily bad: it improves readability and helps to reason about the flow of execution, usually better than calling those functions in a nested way:

func_call3(func_call2(func_call1(val))); // Phew.

Still, it feels cumbersome to chain too many function calls that way. Some time ago I started looking into Clojure for fun and discovered threading macros. The thread-first (->) and thread-last (->>) macros pipe the result of applying a function to a value into the next function in the list. The difference is that thread-first inserts the result as the first argument of the next function, while thread-last appends it as the last one.

Wouldn't it be nice to have a similar mechanism in JavaScript? Looking around I found that someone had already written a nice article about it. The goal is to have a function that can be called like this:

let result = thread("->", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "-4"

And what would be going on under the hood is this:

let sameresult = str(diff(sum(parseInt("3"), 3), 10)); // "-4"

So I decided to reimplement the code taking advantage of the new features that came with ES6 (arrow functions, destructuring, rest parameters, and so on).

const thread = (operator, first, ...args) => {
    let isThreadFirst;
    switch (operator) {
        case '->>':
            isThreadFirst = false;
            break;
        case '->':
            isThreadFirst = true;
            break;
        default:
            throw new Error('Operator not supported');
    }
    // Pipe the value through every step. A step is either a bare function or an
    // array [fn, ...extraArgs]; the previous result is inserted as the first
    // argument (thread first) or appended as the last one (thread last).
    return args.reduce((prev, next) => {
        if (Array.isArray(next)) {
            const [head, ...tail] = next;
            return isThreadFirst ? head.apply(this, [prev, ...tail]) : head.apply(this, tail.concat(prev));
        }
        else {
            return next.call(this, prev);
        }
    }, first);
};
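
The sum, diff and str functions used in the examples are not part of the snippet above; something like these plain helpers is assumed:

const sum  = (a, b) => a + b;
const diff = (a, b) => a - b;
const str  = (x) => String(x);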

When executing the code with the thread-first operator, for example:

let result = thread("->", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "-4"

console.log(result); // -4 

and with the thread-last operator:

let result = thread("->>", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "4"

console.log(result); // 4 

Have fun!

Getting docker-compose path from cursor position

Introduction

The Stack Builder application, part of the Big Data Europe platform, is a system that helps with building docker-compose.yml files. You can drag and drop existing docker-compose files from the project into a textarea, and also search other repositories in the Big Data Europe GitHub organization, composing a whole new system by taking useful pieces of existing systems and putting them together.

[Screenshot: the Stack Builder interface]

Additionally, it provides some small hinting functionality, for example showing a dropdown menu of already existing containers so they can be linked. The idea is to add more intelligence to this hinting process by knowing the context the user is in while editing the docker-compose file, and therefore what kind of information may be suitable for them.

  web:
    image: nginx
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    command: [nginx-debug, '-g', 'daemon off;']

Say the user has the cursor on the - ./nginx.conf:/etc/nginx/nginx.conf:ro line. If we know that the user is situated at the web.volumes path, we can hint at additional volume mount paths that are commonly used for nginx containers.

The problem is: how do we know where in the docker-compose.yml file the cursor is placed?

Implementation

To see all the code just check the repository; here I will simplify the pieces that are not needed. The initial scenario is simple: the docker-compose file is loaded into a textarea and parsed into a YAML object:

<div>
  <div class="input-field">
    {{textarea id="textarea-autocomplete" value=value label=label}}
    <label id="textarea-label-{{label}}" >{{label}}</label>
  </div>
</div>

yamlObject: Ember.computed('value', function() {
  try {
    const yaml = this.yamlParser(this.get('value'));
    this.setProperties({
      yamlErrorMessage: '',
      yamlError: false
    });
    return yaml;
  }
  catch (err) {
    this.setProperties({
      yamlErrorMessage: err,
      yamlError: true
    });
    return null;
  }
})

This returns a JavaScript object with the parsed YAML. Now, every time the cursor moves in the textarea, either by pressing the arrow keys or by typing, we want to know the path in the YAML object where it is placed, expressed as a dot-separated path (e.g. if the cursor is placed on the first link of the identifier service: services.identifier.links.0).
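
As a rough sketch, the web/nginx snippet shown earlier would parse into an object like this (assuming a standard YAML parser), and a cursor on the volume line would map to something like web.volumes.0:

{
  web: {
    image: "nginx",
    volumes: ["./nginx.conf:/etc/nginx/nginx.conf:ro"],
    command: ["nginx-debug", "-g", "daemon off;"]
  }
}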

The first thing we need is a way of getting the line where the cursor is placed (for example, - identifier:identifier inside a links object). Since the whole docker-compose.yml is stored as a string inside the textarea, one way of doing it is to build a "context string" starting from the cursor's position and adding characters to the left and to the right until a "stop character" is found, i.e. one that represents a line break or a tab in the YAML file.

getCursorYmlPath() {
  const text = this.get('value');
  const cursorPosition = Ember.$('#textarea-autocomplete').prop("selectionStart");
  const stringLeft = this.stringPad('left');
  const stringRight = this.stringPad('right');
  const contextString = `${stringLeft(text, cursorPosition).text.trim()}${stringRight(text, cursorPosition).text.trim()}`;
}

The stringPad function returns the characters of the string from the cursor index up to the first stop character, in the given direction.

stringPad(direction, write) {
  return function (text, cursor) {
    let stopChars = ['\n', '\t'];
    let i = cursor;
    let predicate = write ? () => stopChars.indexOf(text[i-1]) : () => stopChars.indexOf(text[i]);
    while (predicate() === -1 && i > 0 && i < text.length) {
      if (direction === 'right') {
        i = i + 1;
      }
      else if (direction === 'left') {
        i = i - 1;
      }
      else {
        break;
      }
    }
    if (direction === 'right') {
      return {
        text: text.slice(cursor, i),
        index: i
      };
    }
    else if (direction === 'left') {
      return {
        text: text.slice(i, cursor),
        index: i
      };
    }
    else {
      return { text: "", index: -1 };
    }
  };
}
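
As a quick illustration of how the two halves combine (the text and cursor position here are made up):

// Suppose text is "  links:\n    - dispatcher:dispatcher\n" and the cursor sits
// in the middle of the second line. stringPad('left') walks back to the first
// '\n', stringPad('right') walks forward to the trailing '\n', and after
// trimming, the two slices concatenate into "- dispatcher:dispatcher".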

In the end, printing contextString gives you the whole line: "- dispatcher:dispatcher".

The next step is to know where in the docker-compose.yml the contextString can be found. Since the aforementioned line can appear in several services inside a docker-compose file, I build a list of object paths that match the context string:

Array.prototype.flatten = function() {
  let arr = this;
  while (arr.find(el => Array.isArray(el))) { arr = Array.prototype.concat(...arr); }
  return arr;
};

getCursorYmlPath() {
  (...prev...)
  const pathMatches = this.getYmlPathMatches(contextString, this.get('yamlObject')).flatten();
}


getYmlPathMatches(contextString, yaml, currentPath) {
  if (yaml && yaml !== null) {
    var currentPath = currentPath || "root";

    return Object.keys(yaml).map((key) => {
      if (typeof yaml[key] === "object" && yaml[key] !== null) {
        if (contextString.includes(key)) {
          return [`${currentPath}.${key}`].concat(this.getYmlPathMatches(contextString, yaml[key], `${currentPath}.${key}`));
        }
        else {
          return this.getYmlPathMatches(contextString, yaml[key], `${currentPath}.${key}`);
        }
      }
      else {
        // Key is not of numeric type (so we are not inside an array)
        if (isNaN(key)) {
          if (contextString.includes(key) || contextString.includes(yaml[key])) {
            return `${currentPath}.${key}`;
          }
          else return [];
        }
        else {
          if (contextString.includes(yaml[key])) {
            return `${currentPath}.${key}`;
          }
          else return [];
        }
      }
    });
  }
  else return [];
}

Using root as the root object path, the result is a list of object paths like this:

["root.services.identifier.links.0", "root.services.dispatcher"]

Lastly, I retrieve the index in the pathMatches array that corresponds to the match closest to the cursor's position.

getCursorYmlPath() {
  (...prev...)
  const tramo = text.length / pathMatches.length;
  const probableIndex = Math.floor(cursorPosition / tramo);
  return pathMatches[probableIndex];
}
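
The heuristic simply splits the document into as many equal stretches (tramo) as there are matches and picks the one the cursor falls into. With made-up numbers:

// Hypothetical example: a 1200-character document, the two matches from above,
// and the cursor at offset 900:
//   tramo         = 1200 / 2 = 600
//   probableIndex = Math.floor(900 / 600) = 1
// so pathMatches[1], "root.services.dispatcher", is returned.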

There may be edge cases that I have not taken into account, but so far it is working nicely.

Have fun!