Prusa Mini: programmatically upload files via curl bash script

http api ethernet 3d printing prusa mini+

Thanks to the recent BuddyBoard firmware v4.4.1, the HTTP file API works as desired: you can easily upload files to a USB stick attached to the printer. To perform bulk updates of your printer farm, it's much easier to write a simple bash script which deploys the print jobs:

#!/usr/bin/env bash

set -e

# printer settings
PRINTER_HOST="192.168.1.123"
API_KEY="ToEn8eDlR7kWIiUpVPJg"
FILENAME=myfile.gcode

# capture command stdout - http status code will be written to stdout
# progress bar on stderr
# http response (json) stored in /tmp/.upload-response
CURL_EXITCODE=0
CURL_HTTP_STATUS=$(curl \
    --header "X-Api-Key: ${API_KEY}" \
    -F "file=@${FILENAME}" \
    -F "path=" \
    -X POST \
    -o /tmp/.upload-response \
    --write-out "%{http_code}" \
    "http://${PRINTER_HOST}/api/files/local"
) || CURL_EXITCODE=$?

# get response body (json)
CURL_RESPONSE=$(cat /tmp/.upload-response)

# success ?
if [ ${CURL_EXITCODE} -ne 0 ] || [ "${CURL_HTTP_STATUS}" != "201" ]; then
    echo "error: upload failed (exitcode=${CURL_EXITCODE}; http-status=${CURL_HTTP_STATUS})"
    echo "${CURL_RESPONSE}"
else
    echo "upload succeeded"
fi

Uploading multiple files including checksums via HTTP can be achieved with cURL and a few lines of bash scripting. This might replace scp in most cases.

# array of files (and checksums) provided as cURL options
UPLOAD_FILES=()

# get all files within myUploadDir dir and calculate checksums
while read -r FILE
do
    # get sha256 checksum
    CHECKSUM=$(sha256sum "${FILE}" | awk '{print $1}')
    echo "${FILE}"
    echo "${CHECKSUM}"

    # extract filename
    FILENAME=$(basename "${FILE}")

    # append file and checksum to curl upload args
    UPLOAD_FILES+=("-F" "file=@${FILE}") 
    UPLOAD_FILES+=("-F" "${FILENAME}=${CHECKSUM}")

# get all files within myUploadDir
done < <(find myUploadDir -type f | sort)

# upload - curl generates the multipart/form-data header (incl. boundary) automatically
curl \
     -X PUT \
     "${UPLOAD_FILES[@]}" \
     https://httpbin.org/put

Hetzner Cloud: Predictable Network Interface Names

ens3 ens10 ens11 ens12 enp1s0 enp7s0 enp8s0 enp9s0

With the release of the new AMD EPYC based cloud servers (CPX), Hetzner has applied some changes to their virtualization platform (QEMU). The network interface names have changed due to the modern virtio_net network adapter (device ID 0x1041), which comes with different PCIe bus addresses. All Hetzner standard images now use the net.ifnames=0 setting to enforce the kernel […]
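For reference, this is how the net.ifnames=0 setting is typically applied on Debian/Ubuntu based images via the kernel command line (a minimal sketch, assuming GRUB is used as the bootloader):

# /etc/default/grub - disable predictable network interface names
GRUB_CMDLINE_LINUX="net.ifnames=0"

# apply the changes and reboot afterwards
update-grub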

Traefik: tls private key does not match public key

self signed certificates, combined pem

In case you’re using self-signed X.509 certificates, you may see this error message within the Traefik logs – the solution is quite easy: the first certificate of your combined PEM file (ca+intermediate+server) has to be the server certificate!
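For example, a combined PEM file in the correct order can be created like this (filenames are placeholders):

# the server certificate has to come first, followed by intermediate + root ca
cat server.crt intermediate.crt ca.crt > combined.pem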

Express.js: Global template variables with EJS

Using ejs as template engine within the express.js default configuration can be very annoying – you have to pass a dedicated variable set to each response.render() call. But a lot of tasks require some kind of global variables in your templates, e.g. the page title, resources and much more.

The most reliable solution is a custom template renderer which invokes ejs in the way you want.

Custom Template Engine/Renderer Function#

const _ejs = require('ejs');

// example: global config
const _config = require('../config.json');

// custom ejs render function
module.exports = function render(filename, payload={}, cb){
    // some default page vars
    payload.page = payload.page || {};
    payload.page.slogan = payload.page.slogan || _config.slogan;
    payload.page.title = payload.page.title || _config.title;
    payload.page.brandname = payload.page.brandname || _config.name;

    // resources
    payload.resources = payload.resources || {};

    // render file
    // you can also pass some ejs low-level options here
    _ejs.renderFile(filename, payload, {}, cb);
};

Usage#

const _express = require('express');
const _webapp = _express();
const _path = require('path');
const _tplengine = require('./my-template-engine');

// set the view engine to ejs
_webapp.set('views', _path.join(__dirname, '../views'));
_webapp.engine('ejs', _tplengine);
_webapp.set('view engine', 'ejs');

// your controller
_webapp.get('/', function(req, res){
   // render the view using additional variables
   res.render('myview', {
     x: 1,
     y: 2
   });
});


Use EnlighterJS with marked

markdown, gfm, javascript, nodejs

marked is one of the most popular markdown parsers written in javascript. It’s quite easy to integrate EnlighterJS – just pass a custom renderer as an option which wraps code blocks into EnlighterJS-compatible markup.

Promise based highlighting#

File: markdown.js

const _marked = require('marked');
const _renderer = new _marked.Renderer();

// escape html specialchars
function escHtml(s){
    return s.replace(/&/g, '&amp;')
            .replace(/"/g, '&quot;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;');
}

// EnlighterJS Codeblocks
_renderer.code = function(code, lang){
    return `<pre data-enlighter-language="${lang}">${escHtml(code)}</pre>`;
};

const _options = {
    // gfm style line breaks
    breaks: true,

    // custom renderer
    renderer: _renderer
};

// promise proxy
function render(content){
    return new Promise(function(resolve, reject){
        // async rendering
        _marked(content, _options, function(e, html){
            if (e){
                reject(e);
            }else{
                resolve(html);
            }
        });
    });
}

module.exports = {
    render: render
};


Usage#

const _markdown = require('./markdown');

// fetch markdown based content
const rawCode = getMarkdownContent(..);

// render content
const html = await _markdown.render(rawCode);


Node.js: Compare directory contents via sha256 checksums

Comparing the content of two directories in a binary-safe manner is a commonly used feature, especially for data synchronization tasks. You can easily implement a simple compare algorithm by generating the sha256 checksum of each file – this is not a high-performance solution, but it even works on large files!

const _fs = require('fs-magic');

// compare directory contents based on sha256 hash tables
async function compareDirectories(dir1, dir2){
    // fetch file lists
    const [files1, dirs1] = await _fs.scandir(dir1, true, true);
    const [files2, dirs2] = await _fs.scandir(dir2, true, true);

    // num files, directories equal ?
    if (files1.length != files2.length){
        throw new Error('The directories contain a different number of files ' + files1.length + '/' + files2.length);
    }
    if (dirs1.length != dirs2.length){
        throw new Error('The directories contain a different number of subdirectories ' + dirs1.length + '/' + dirs2.length);
    }

    // generate file checksums
    const hashes1 = await Promise.all(files1.map(f => _fs.sha256file(f)));
    const hashes2 = await Promise.all(files2.map(f => _fs.sha256file(f)));

    // convert arrays to objects filename=>hash
    const lookup = {};
    for (let i=0;i<hashes2.length;i++){
        // normalized filenames
        const f2 = files2[i].substr(dir2.length);
        
        // assign
        lookup[f2] = hashes2[i];
    }

    // compare dir1 to dir2
    for (let i=0;i<hashes1.length;i++){
        // normalized filenames
        const f1 = files1[i].substr(dir1.length);

        // exists ?
        if (!lookup[f1]){
            throw new Error('File <' + files1[i] + '> does not exist in <' + dir2 + '>');
        }

        // hash valid ?
        if (lookup[f1] !== hashes1[i]){
            throw new Error('File Checksum of <' + files1[i] + '> does not match <' + files2[i] + '>');
        }
    }

    return true;
}

await compareDirectories('/tmp/data0', '/tmp/data1');


TravisCI: Use custom Node.js version within container based builds

nodejs binary, custom version, second language

Sometimes you may need a special version of Node.js, or a recent version within a foreign build environment. But in the modern container-based infrastructure it is not possible to use apt to install custom packages which are not whitelisted. As a workaround, you can download pre-built binaries via wget into your build directory and add the bin/ dir to your PATH. This allows you to use any pre-built third-party software without installation.

Example: Perl with JavaScript testcases#

os: linux

language: perl

perl:
  - "5.24"
  - "5.14"

# skip perl (cpanm) dependency management
# install nodejs into home folder
install: 
  # fetch latest nodejs archive
  - wget https://nodejs.org/dist/v8.8.1/node-v8.8.1-linux-x64.tar.gz -O /tmp/nodejs.tgz
  # unzip
  - tar -xzf /tmp/nodejs.tgz
  # add nodejs binaries to path - this has to be done here!
  - export PATH=$PWD/node-v8.8.1-linux-x64/bin:$PATH
  # show node version
  - node -v
  - npm -v
  # install node dependencies
  - npm install

script:
  # syntax check
  - perl -Mstrict -Mdiagnostics -cw rsnapshot
  # run javascript based tests
  - npm test


TravisCI: Setup MySQL Tables+Data before running Tests

test, mysql, mariadb, travis, continuous integration, before_install

In case your projects make use of external databases like MySQL/MariaDB, you need to set up your continuous integration tests with dedicated testcases including application-specific database structures. This requires some initial steps to load the database dump before starting the tests. Thanks to travis-ci.org you don’t need to do this kind of stuff within your application – just use the test configuration!

Travis+MySQL Server#

First of all, we add MySQL Server as a service within our .travis.yml file. This initializes a dedicated database instance for testing. Additionally, we hook into the before_install action to initialize our database structure. In this example, all SQL commands are loaded from an external file located in our test directory.

language: node_js
node_js:
  - "7"
  - "7.6"
  - "8"
services:
  - mysql
before_install:
  - mysql -u root --password="" < test/travis.sql

Initial Database Setup#

Our test database structure is defined within a dedicated SQL file in test/travis.sql. It contains all necessary commands to add a new user, create the demo database, create the demo tables and finally add some test data.

# Create Testuser
CREATE USER 'dev'@'localhost' IDENTIFIED BY 'dev';
GRANT SELECT,INSERT,UPDATE,DELETE,CREATE,DROP ON *.* TO 'dev'@'localhost';

# Create DB
CREATE DATABASE IF NOT EXISTS `demo` DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
USE `demo`;

# Create Table
CREATE TABLE IF NOT EXISTS `users` (
  `user_id` int(11) NOT NULL,
  `created_on` timestamp NULL DEFAULT NULL,
  `modified_on` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `username` varchar(50) DEFAULT NULL,
  `salt` varchar(20) DEFAULT NULL,
  `password` varchar(50) DEFAULT NULL,
  `email` varchar(150) DEFAULT NULL,
  `firstname` varchar(50) DEFAULT NULL,
  `lastname` varchar(50) DEFAULT NULL,
  `dob` date DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

ALTER TABLE `users`
  ADD PRIMARY KEY (`user_id`);

ALTER TABLE `users`
  MODIFY `user_id` int(11) NOT NULL AUTO_INCREMENT;

# Add Data
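# note: illustrative example row matching the schema above (values are placeholders)
INSERT INTO `users` (`created_on`, `username`, `salt`, `password`, `email`, `firstname`, `lastname`, `dob`)
  VALUES (NOW(), 'demo', 'a1b2c3d4', '5f4dcc3b5aa765d61d83', 'demo@example.org', 'John', 'Doe', '1990-01-01');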

Node.js: Log static file requests with expressjs serve-static middleware

nodejs, express, static, logfile, analytics, statistics, download counter

Almost every web application requires some kind of request logging; especially package downloads are often counted for statistical purposes. By using expressjs, static content is served by the middleware module serve-static.

To count the successful requests handled by this module, you can hook into the setHeaders callback, which is invoked each time a file is ready for delivery (the file exists and is accessible).

Example#

// utility
const _path = require('path');

// expressjs
const _express = require('express');
let _webapp = _express();

// your statistic module
const _downloadStats = require('./download-counter');

// serve static package files
_webapp.use('/downloads', _express.static(_path.join(__dirname, 'downloads'), {
    // setHeaders is only called on success (stat available/file found)
    setHeaders: function(res, path, stat){
        // count request: full-path, file-stats, client-ip
        _downloadStats(path, stat, res.req.connection.remoteAddress);
    }
}));
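The download-counter module itself is not part of this example – a minimal sketch of such a module could look like this (hypothetical implementation which just appends one log line per delivered file):

// file: download-counter.js
// minimal sketch - hypothetical implementation
const _fs = require('fs');
const _path = require('path');

// log one line per delivered file: timestamp;client-ip;filename;size
module.exports = function downloadStats(filepath, stat, clientIP){
    const line = [
        new Date().toISOString(),
        clientIP,
        _path.basename(filepath),
        stat.size
    ].join(';') + '\n';

    // non-blocking append; just log errors to console
    _fs.appendFile('downloads.log', line, function(e){
        if (e){
            console.error(e);
        }
    });
};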