mdlink: Linking multiple node modules with a single command

Introduction

When working on node.js applications, developers usually rely on additional libraries and frameworks to ease and speed up development, and more often than not the number of libraries they depend on is quite high.

It is also not uncommon to need to modify one of those libraries to fix a bug, or to fork it to add functionality required by the project it is being used in.

The usual way to do this in a node.js project is:

  1. Clone the library's repository on your local machine.
  2. cd into the library's folder.
  3. sudo npm link to link the module globally.
  4. cd into your project's folder.
  5. npm link <library_name> to link to the library.

If you need to link several libraries at the same time, or a library you are linking depends on another library that also needs to be linked, doing it one by one quickly becomes cumbersome.

In my free time I developed mdlink, a small utility that makes it easy to npm link multiple node modules in a given project.

It was born out of my annoyance while developing Ember.js applications with multiple addons at the same time: for each addon I'd need to clone it and link it. This tool intends to provide an easy way to npm link modules in bulk and to remove those links later when they are no longer necessary.

Modules are specified in the mdlink.config.json configuration file, e.g.:

{
  "base_modules_path": "~/gits/test",
  "modules": {
    "mdlink": {
      "url": "https://github.com/fr0gs/mdlink",
      "path": "~/gits/test/mdlink"
    }
  }
}

Each module to be linked is declared in the modules object. It can have both url and path together, or just one of them.

The base_modules_path is used when a module has no path but a url is specified; in that case the module will be cloned into base_modules_path/module_name instead.
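
For illustration, a hypothetical entry that specifies only a url (the repository address below is a made-up placeholder) would then be cloned into base_modules_path/my-module:

{
  "base_modules_path": "~/gits/test",
  "modules": {
    "my-module": {
      "url": "https://github.com/example/my-module"
    }
  }
}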

Let's see an example with a new Ember application. First, download mdlink with npm and create the application:

test-ember:master* λ sudo npm install -g mdlink
test-ember:master* λ ember new test-ember

Then, create a new mdlink.config.json file inside the test-ember app.

test-ember:master* λ mdlink init

And let's say that in our new application we want to locally modify both ember-moment and ember-promise-helpers. We modify the mdlink.config.json file like this:

{
  "base_modules_path": "/home/esteban/gits/test",
  "modules": {
    "ember-moment": {
      "url": "https://github.com/stefanpenner/ember-moment",
      "path": "/home/esteban/gits/test/ember-moment"
    },
    "ember-promise-helpers": {
      "url": "https://github.com/fivetanley/ember-promise-helpers.git",
      "path": "/home/esteban/gits/test/ember-promise-helpers"
    }
  }
}

Do the linking:

test-ember:master* λ mdlink s
[+] Path /home/esteban/gits/test-ember/node_modules/ember-moment already exists, removing it.
[+] <path exists> . Successfully cloned https://github.com/stefanpenner/ember-moment in path: /home/esteban/gits/test/ember-moment
[+] Create link from /usr/lib/node_modules/ember-moment -> /home/esteban/gits/test/ember-moment
-------------------------
[+] <path exists> . Successfully cloned https://github.com/fivetanley/ember-promise-helpers.git in path: /home/esteban/gits/test/ember-promise-helpers
[+] Create link from /usr/lib/node_modules/ember-promise-helpers -> /home/esteban/gits/test/ember-promise-helpers

Verify both modules were properly cloned:

test λ ls -la ~/gits/test/
drwxrwxr-x  4 esteban esteban 4,0K Dez 19 22:07 ./
drwxrwxr-x  8 esteban esteban 4,0K Dez 19 22:05 ../
drwxrwxr-x 10 esteban esteban 4,0K Dez 19 22:36 ember-moment/
drwxrwxr-x  9 esteban esteban 4,0K Dez 19 22:07 ember-promise-helpers/

Modify index.js in both modules, and run ember s from the test-ember app folder:

test-ember:master* λ ember s
Linked ember-moment
Linked ember-promise-helpers
Could not start watchman
Visit https://ember-cli.com/user-guide/#watchman for more info.
Livereload server on http://localhost:7020

Build successful (14117ms) – Serving on http://localhost:4200/

Have fun!

Clojure threading macros in ES6

While working on the functionality for getting a docker-compose path from the cursor position, I realized I was constantly writing code like this:

let first_result = func_call1(val);
let second_result = func_call2(first_result);
let third_result = func_call3(second_result);

...etc...

This is not necessarily bad: it improves code readability and helps to reason about the flow of execution, usually better than calling those functions in a nested way:

func_call3(func_call2(func_call1(val))); // Phew.

Still, it feels somewhat cumbersome to nest too many function calls that way. Some time ago I started looking into Clojure for fun and discovered threading macros. The thread-first (->) and thread-last (->>) macros pipe the result of applying a function to a value into the next function in the list. The difference is that thread-first passes the result as the first argument of the next function, and thread-last passes it as the last one.

Wouldn't it be nice to have a similar mechanism in Javascript? Looking around I found someone had already written a nice article about it. The goal is to have a function that can be called like this:

let result = thread("->", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "-4"

And what would be going on under the hood is this:

let sameresult = str(diff(sum(parseInt("3"), 3), 10)); // "-4"
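
The helpers sum, diff and str are not defined in the original snippet; something along these lines (my assumption, inferred from the expected results) makes the examples runnable:

// Assumed helper functions, not part of the original article's code.
const sum = (a, b) => a + b;   // sum(3, 3) === 6
const diff = (a, b) => a - b;  // diff(6, 10) === -4
const str = (x) => String(x);  // str(-4) === "-4"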

So I decided to reimplement the code taking advantage of the new features that came along with ES6 (arrow functions, destructuring, rest parameters...).

const thread = (operator, first, ...args) => {
    let isThreadFirst;
    switch (operator) {
        case '->>':
            isThreadFirst = false;
            break;
        case '->':
            isThreadFirst = true;
            break;
        default:
            throw new Error('Operator not supported');
    }
    // Pipe the accumulated value through each step. A step is either a bare
    // function or an array of [function, ...extraArgs].
    return args.reduce((prev, next) => {
        if (Array.isArray(next)) {
            const [head, ...tail] = next;
            // Thread first: prev goes in front; thread last: prev goes at the end.
            return isThreadFirst ? head.apply(this, [prev, ...tail]) : head.apply(this, tail.concat(prev));
        }
        else {
            return next.call(this, prev);
        }
    }, first);
};

So, executing the code using the thread-first operator, for example:

let result = thread("->", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "-4"

console.log(result); // -4 

and using the thread last operator:

let result = thread("->>", "3", 
                        parseInt,
                        [sum, 3],
                        [diff, 10],
                        str); // "4"

console.log(result); // 4 
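
Since thread last appends the piped value as the last argument of each call, the thread-last version expands to the following, which is why the result is 4 instead of -4:

let sameresult = str(diff(10, sum(3, parseInt("3")))); // "4"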

Have fun!

Getting docker-compose path from cursor position

Introduction

The Stack Builder application, part of the Big Data Europe platform, is a system that helps in the process of building docker-compose.yml files. You can drag & drop existing docker-compose files from the project into a textarea to view them, and also search for other repositories in the Big Data Europe github organization, composing a whole new system by taking useful pieces of existing systems and putting them together.

[Image: stackbuilder1]

Additionally, it provides some small hinting functionality, for example by showing a dropdown menu with already existing containers to link to. The idea is to add more intelligence to this hinting process by knowing the context the user is in while editing the docker-compose file, and therefore what kind of information may be suitable for them.

  web:
    image: nginx
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    command: [nginx-debug, '-g', 'daemon off;']

Say the user has the cursor on the - ./nginx.conf:/etc/nginx/nginx.conf:ro line. If we know the user is situated in the web.volumes path, we can hint additional volume mount paths that are commonly used for nginx containers.

The problem is: how do we know where in the docker-compose.yml file the cursor is placed?

Implementation

To see all the code just check the repository; here I will simplify the pieces that are not needed. The initial scenario is simple: the docker-compose file is loaded into a textarea and parsed into a YAML object:

<div>
  <div class="input-field">
    {{textarea id="textarea-autocomplete" value=value label=label}}
    <label id="textarea-label-{{label}}" >{{label}}</label>
  </div>
</div>

yamlObject: Ember.computed('value', function() {
  try {
    const yaml = this.yamlParser(this.get('value'));
    this.setProperties({
      yamlErrorMessage: '',
      yamlError: false
    });
    return yaml;
  }
  catch (err) {
    this.setProperties({
      yamlErrorMessage: err,
      yamlError: true
    });
    return null;
  }
})

This returns a javascript object with the parsed YAML. Now, every time the cursor moves in the textarea, either by pressing arrow keys or by typing, we want to know the path in the yaml object where it is placed, expressed as a dot-separated path (i.e. if the cursor is placed on the first link of the identifier service: services.identifier.links.0).

The first thing we need is a way of getting the line where the cursor is placed (for example, - identifier:identifier inside a links object). Since the whole docker-compose.yml is stored as a string inside the textarea, one way of doing it is building a "context string" starting from the cursor's position and adding characters both left and right until you find "stop characters", namely those that represent a line break or a tabulation in the YAML file.

getCursorYmlPath() {
  const text = this.get('value');
  const cursorPosition = Ember.$('#textarea-autocomplete').prop("selectionStart");
  const stringLeft = this.stringPad('left');
  const stringRight = this.stringPad('right');
  const contextString = `${stringLeft(text, cursorPosition).text.trim()}${stringRight(text, cursorPosition).text.trim()}`;
}

The stringPad function returns the padding characters of the string, starting from the cursor index, until it finds a stop character.

stringPad(direction, write) {
  return function (text, cursor) {
    let stopChars = ['\n', '\t'];
    let i = cursor;
    let predicate = write ? () => stopChars.indexOf(text[i-1]) : () => stopChars.indexOf(text[i]);
    while (predicate() === -1 && i > 0 && i < text.length) {
      if (direction === 'right') {
        i = i + 1;
      }
      else if (direction === 'left') {
        i = i - 1;
      }
      else {
        break;
      }
    }
    if (direction === 'right') {
      return {
        text: text.slice(cursor, i),
        index: i
      };
    }
    else if (direction === 'left') {
      return {
        text: text.slice(i, cursor),
        index: i
      };
    }
    else {
      return { text: "", index: -1 };
    }
  };
}

At the end, printing the contextString gives the whole line: "- dispatcher:dispatcher".

The next step is to find out where in the docker-compose.yml the contextString can be found. Since the previously mentioned line can appear in several services inside a docker-compose file, I build a list of object paths that match the context string:

Array.prototype.flatten = function() {
  let arr = this;
  while (arr.find(el => Array.isArray(el))) { arr = Array.prototype.concat(...arr); }
  return arr;
};

getCursorYmlPath() {
  (...prev...)
  const pathMatches = this.getYmlPathMatches(contextString, this.get('yamlObject')).flatten();
}


getYmlPathMatches(contextString, yaml, currentPath) {
  if (!yaml) {
    return [];
  }
  currentPath = currentPath || "root";

  return Object.keys(yaml).map((key) => {
    if (typeof yaml[key] === "object" && yaml[key] !== null) {
      if (contextString.includes(key)) {
        return [`${currentPath}.${key}`].concat(this.getYmlPathMatches(contextString, yaml[key], `${currentPath}.${key}`));
      }
      else {
        return this.getYmlPathMatches(contextString, yaml[key], `${currentPath}.${key}`);
      }
    }
    else {
      // Key is not numeric (so we are not inside an array)
      if (isNaN(key)) {
        if (contextString.includes(key) || contextString.includes(yaml[key])) {
          return `${currentPath}.${key}`;
        }
        else return [];
      }
      else {
        if (contextString.includes(yaml[key])) {
          return `${currentPath}.${key}`;
        }
        else return [];
      }
    }
  });
}

Using root as the root object path, the result is a list of object paths like this:

["root.services.identifier.links.0", "root.services.dispatcher"]

Lastly, I retrieve the index in the pathMatches array that corresponds to the closest match to the cursor's position.

getCursorYmlPath() {
  (...prev...)
  // "tramo" (Spanish for "stretch"): the share of the text assigned to each match.
  const tramo = text.length / pathMatches.length;
  const probableIndex = Math.floor(cursorPosition / tramo);
  return pathMatches[probableIndex];
}
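
As a rough illustration of the heuristic, with made-up numbers: if the textarea content were 1200 characters long, the cursor were at position 800 and there were two matches, each match would be assumed to cover a 600-character stretch, so the second path would be picked:

// Hypothetical values, only to illustrate the heuristic above.
const pathMatches = ["root.services.identifier.links.0", "root.services.dispatcher"];
const textLength = 1200;
const cursorPosition = 800;

const tramo = textLength / pathMatches.length;            // 600
const probableIndex = Math.floor(cursorPosition / tramo); // 1
console.log(pathMatches[probableIndex]);                  // "root.services.dispatcher"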

There may be edge cases that I have not taken into account, but so far it is working nicely.

Have fun!

Routing in Javascript

Introduction

Originally, web applications consisted of interconnected html documents that one could navigate through via links. Every time a user clicked a link on a website, a new document would be generated on the server and sent back to the browser to be rendered on the screen.

Around the year 2005 the term Single-Page Application (SPA) became popular. The term encompassed a new way of architecting websites to make them behave more like desktop applications: snappy, with graphical animations and smooth transitions between links.
This was achieved by taking advantage of javascript, html & css, as new APIs became available to give the browser more native-like capabilities.

SPAs are based on a single-document model: the web application's whole lifespan happens on a single html page, along with the transitions between the different views. But since links no longer imply fetching and generating a new document, how are those transitions modelled? They are handled by a router.

What is a Javascript Router?

A Javascript router is a key component in most frontend frameworks. It is the piece of software in charge of organizing the states of the application and switching between different views. For example, the router will render the login screen initially, and when the login is successful it will perform the transition to the user's welcome screen.

How it works.

The router is in charge of simulating transitions between documents by watching for changes in the URL. When the document is reloaded or the URL is modified somehow, it will detect that change and render the view associated with the new URL.

I wrote a small router in javascript to illustrate the idea. To begin with we need two objects, one to store the routes and another to store the templates, along with two simple functions to register them.

Templates are just one way of describing the DOM that will be generated when the transition from one route to the other is completed. The whole javascript application will live in a div element.
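
Assuming the application div has the id app (matching the appDiv constant used below) and the script lives in a file called router.js, a minimal host page could look like this:

<!-- Minimal, illustrative host page; the file name router.js is an assumption. -->
<!DOCTYPE html>
<html>
  <body>
    <div id="app"></div>
    <script src="router.js"></script>
  </body>
</html>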

// Application div
const appDiv = "app";

// Both set of different routes and template generation functions
let routes = {};
let templates = {};

// Register a template (this is to mimic a template engine)
let template = (name, templateFunction) => {
  return templates[name] = templateFunction;
};

// Define the routes. Each route is described with a route path & a template to render
// when entering that path. A template can be a string (file name), or a function that
// will directly create the DOM objects.
let route = (path, template) => {
    if (typeof template == "function") {
      return routes[path] = template;
    }
    else if (typeof template == "string") {
      return routes[path] = templates[template];
    }
    else {
      return;
    }
};

Now we will be able to register templates and routes, creating the mapping between them:

// Register the templates.
template('template1', () => {
    let myDiv = document.getElementById(appDiv);
    myDiv.innerHTML = "";
    const link1 = createLink('view1', 'Go to view1', '#/view1');
    const link2 = createLink('view2', 'Go to view2', '#/view2');

    myDiv.appendChild(link1);
    return myDiv.appendChild(link2);
});

template('template-view1', () => {
    let myDiv = document.getElementById(appDiv);
    myDiv.innerHTML = "";
    const link1 = createDiv('view1', "<div><h1>This is View 1 </h1><a href='#/'>Go Back to Index</a></div>");
    return myDiv.appendChild(link1);
});

template('template-view2', () => {
    let myDiv = document.getElementById(appDiv);
    myDiv.innerHTML = "";
    const link2 = createDiv('view2', "<div><h1>This is View 2 </h1><a href='#/'>Go Back to Index</a></div>");
    return myDiv.appendChild(link2);
});


// Define the mappings route->template.
route('/', 'template1');
route('/view1', 'template-view1');
route('/view2', 'template-view2');

For the templates, we map a template name to a function that creates the DOM elements and appends them to the div where the application lives. In a real router this functionality would be handled by the templating engine. For the routes, we just map a route path to the corresponding template.

createLink & createDiv are auxiliary functions to generate DOM:

// Generate DOM tree from a string
let createDiv = (id, xmlString) => {
    let d = document.createElement('div');
    d.id = id;
    d.innerHTML = xmlString;
    return d.firstChild;
};

// Helper function to create a link.
let createLink = (title, text, href) => {
    let a = document.createElement('a');
    let linkText = document.createTextNode(text);
    a.appendChild(linkText);
    a.title = title;
    a.href = href;
    return a;
};

What is left is the logic to detect changes in the URL and resolve them to render the right template. To do so, listen for the load & hashchange events. The former fires when the document is loaded, and the latter when the URL hash changes.

// Return the corresponding route (template) or fail
let resolveRoute = (route) => {
    const template = routes[route];
    if (!template) {
        throw new Error("The route is not defined");
    }
    return template;
};

// The actual router, get the current URL and generate the corresponding template
let router = (evt) => {
    const url = window.location.hash.slice(1) || "/";
    const routeResolved = resolveRoute(url);
    routeResolved();
};

// For first load or when routes are changed in browser url box.
window.addEventListener('load', router);
window.addEventListener('hashchange', router);

That's it! Of course a lot of functionality is missing: controllers to transform data before passing it to the views, nested routes, use of the history API, etc., but the idea of javascript routing is quite easy to grasp. The whole code can be found in this gist.

Have fun!

My experience at JSCONF Belgium 2017.

This Thursday, June 29th, the Jsconf.be conference was held in the beautiful city of Brugge, and I had the chance to go! I had already taken a look at the speakers and I was interested in at least four talks, so it was more than worth going.

The talks started in the afternoon and were structured in two tracks not following any topic in particular, so I had to sacrifice some.

Keynote

There was a big delay on the trains so I couldn't watch this one in full; the speaker, Peter Paul Koch, was already halfway through his talk when I arrived. The rest of the keynote was extremely interesting.

He talked about the well-known obesity of the web, pinpointing how the indiscriminate use of frameworks, build tools and libraries is contaminating the web development environment, generating heavier and heavier websites that take several seconds to download and render even on high-end devices. That makes these sites almost impossible to access for a big part of the world's population, who have low-end devices, unstable and sloppy network connections, and not the high-throughput environments that we are used to in first-world countries.

All this bloat partly originates from web developers trying to emulate the full functionality of native desktop applications using web technologies.

He also brought to the table the uncertainty that frontend developers have to deal with when developing software in such an aggressive and hostile environment as the web. Maybe the so-called javascript fatigue partly originates because we frontend developers try to be taken seriously by over-creating tooling and patterns that exponentially increase the complexity of software.

Every idea was delivered with a thoughtful and encouraging attitude. Far from being a rant, it was inspiring and full of hope towards the web. I enjoyed it very much, and I am already following the speaker to see what he will be up to.

Reactive Programming By Example.

A high-level introduction to Reactive Programming by a couple of speakers: Lander Verhack and his boss. Using a funny and engaging question-and-answer format between them, they gave a very simple overview of what reactive programming is by creating a mock page providing a simple search engine that queries the Spotify API and shows a list of songs, artists and albums.

At work I primarily use Ember.js, so the concepts explained in this talk reminded me closely of Ember observers and computed properties, which are built on top of observers.

The Era of Module Bundlers.

Arun DSouza provided a walkthrough across task runners, build tools, and module bundlers, enumerating the most commonly used ones, either today or historically: grunt, gulp, browserify and webpack were some of the examples.

At the end he focused on webpack as the most complete tool, incorporating task running, bundling, minifying, and several other tasks in a single environment. Despite being interesting, this talk did not give me anything new beyond learning some specifics of Webpack for this or that task.

How AI saved my life.

I really, really enjoyed this talk. Nick Trogh, evangelist at Microsoft BeLux, gave an introduction to the super cool Microsoft Cognitive Services API in Azure and, using a website, gave examples of how users can interact with this API.

The first example made use of the Text Analytics API to extract the sentiment of a text, the subjective opinion of its author: a score closer to zero is negative, closer to one is positive.

Then the Computer Vision API came along. Given an initial small set of example images from a football team, the API detected whether a new person belonged to that football team, as well as their age, emotion (Emotion API), gender, and a range of other parameters, all with very good precision. It was amazing; there is a lot of potential in opening this kind of service up to third parties.

Enterprise Javascript.

I think one of my tweets pretty much summarizes the general idea: Oracle bundled jQuery + knockout.js + require.js + cordova = boom, OracleJET. We will see how much it gets used in the future!

Yet another frontend javascript framework (Peter Paul Koch warned us!) to add to our collection. Oracle jumped on the frontend train with an open-source hotchpotch of existing technologies, aiming to build a community around it.

It has several interesting points: it wants to adopt Web Components, it is responsive by default, it provides internationalization out of the box... Only time will tell where this new tool fits in.

How I hacked my coffee machine with JS.

The speaker's time on stage was soaked in pure awesomeness. Dominik Kundel explained the empowering process of being bored one day and deciding to crack open your flatmate's coffee machine to reverse engineer the microcontroller and learn how the buttons worked and communicated internally.

Having done that, he could hook a Tessel microcontroller to the machine and develop a small service that lets him start brewing coffee remotely.

Really cool home hacking project!

Conclusions.

The overall level of the talks was introductory, but I enjoyed almost all of them very much. The ones on the Microsoft Cognitive Services API and the coffee machine hacking were at the top, and the conference also sparked my interest in Reactive Programming; I will have to look into that too.

Overall, it was fun and enriching to attend, and I brought home several ideas to test. As a fun fact, looking around I didn't find a single linux user in the whole conference; I think I just saw one guy with emacs open! :P