Have developed a bungee launcher, and are currently testing it for our all-in-one airframe + launcher system.


Will begin offering fully built airframes in late August.


We will be at the World Drone Expo this year (Oct 3-4), presenting highly portable and cost-efficient airframes.


Will be releasing landing gear and spar holder parts this month.


Andrew reviewed the parts, and he likes them! I’m looking forward to seeing what he builds with them. Will be adding vertical stabilizer and landing gear mount parts this week. Will also be testing a new 96″ flying plank this week or next, with a gyro stabilizer.

On the machine learning side, we had a research paper accepted by a high-impact BMC journal.


Just launched the Kickstarter campaign. Will add alpha kits and prototype parts to the store tonight.


Will submit the Kickstarter campaign for review today.


Finished recording video and taking pictures of the current state of the modules. Now beginning to fill out the Kickstarter page and preparing a demo release, along with a post that will go up on RCForums.


Developed a new fuselage/pod. Now testing and refining the design so that it can take head-on and lawn-dart crashes.


Kickstarter campaign preparation is taking a bit longer than expected, primarily the video and demo recording. Had to get a new camera with proper resolution, since recording the whole thing on a standard RunCam did not work (the audio did not come out right).


Further evolved some of the modules, and tested a few new UAVs. Began preparing the Kickstarter campaign.


Work has begun on a new method for rapid fixed-wing drone construction; the final materials and modules are currently being tuned, to be released by the end of February. We also find that KF airfoils seem to perform on par with true airfoils, particularly within the Reynolds numbers at which UAV aircraft operate.
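For context on the Reynolds-number regime, Re can be estimated from chord length and airspeed. A minimal sketch, assuming sea-level air properties; the chord and airspeed values are hypothetical illustrations, not measurements from our airframes:

```python
# Rough Reynolds number estimate for a small fixed-wing UAV wing section.
# Assumptions: sea-level air density and viscosity; chord and airspeed
# below are hypothetical example values.
RHO = 1.225    # air density, kg/m^3 (sea level)
MU = 1.81e-5   # dynamic viscosity of air, kg/(m*s)

def reynolds(v_ms: float, chord_m: float) -> float:
    """Re = rho * v * L / mu, using the wing chord as characteristic length."""
    return RHO * v_ms * chord_m / MU

# e.g. a 0.25 m chord at 15 m/s cruise:
re = reynolds(15.0, 0.25)
print(round(re))  # on the order of a few hundred thousand
```

Small UAVs typically sit in this low-Re range, which is where KF (Kline–Fogleman) stepped airfoils are usually flown.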


Sher Industries has released its first, small-scale, hobby-grade prototype, based on the materials we’ve chosen and tested to be substantially stiffer, stronger, more scalable, and more reliable and crash resistant than foam+glue, which is used in most non-military/industrial-grade drones.


Sher Industries LLC has filed patents for new UAV systems, of both the utility and design types.

Work has also begun on DXNNv3 and a UAV simulator.


Have been exploring numerous materials and methods of constructing fixed-wing UAV systems, which should greatly improve the speed, cost effectiveness, reliability, and crash resistance of the airframes for users.


S.I. LLC has been testing various types of UAV airframes, constructed from numerous materials used in both the hobby-grade and industrial-grade sectors. The goal is to find a material that makes the construction of UAV systems extremely cost effective, allowing us to pass our cost savings on to our clients, and to release the materials and methods we have found to those clients who wish to construct some of the airframe parts themselves. We have found a material that is easier to work with than foam (especially if one does not have a foam cutter), that scales well, is prevalent and easy to acquire, and allows us to use screws and other mechanical fastening methods rather than glue. The material was rather prevalent about a decade ago… we think it has an incredible synergy when combined with modern-day rapid prototyping and manufacturing technologies.


Moved all hosting to a new private server, and have commenced developing the mechanical aspects of a UAV system. What is to come will be related to the drone systems mentioned in the first chapter of my book: Handbook of Neuroevolution Through Erlang.

Q-Learning has been added, A* has been added, and deep learning (SDAE, and numerous variations of denoising- and autoencoder-based stacks, with backprop included) has been added to DXNN as modules that can be used as preprocessors in sensors, as postprocessors in actuators, or as neurodes. I have been rather lazy with regards to updating this site and pushing the code to GitHub; the regularity shall improve by the end of the month, as I move the website to a different platform.
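The preprocessor idea above can be sketched as plumbing: a sensor's raw percept is routed through a trained module (e.g. an encoder) before it reaches the network. A minimal illustration in Python (DXNN itself is written in Erlang; the function names and the toy "encoder" below are hypothetical):

```python
# Illustrative plumbing only: compose a raw sensor reading with a
# preprocessor, the way a trained encoder could sit in front of a
# network's sensor. All names here are hypothetical sketches.
from typing import Callable, List

Vector = List[float]

def make_sensor(read_raw: Callable[[], Vector],
                preprocess: Callable[[Vector], Vector]) -> Callable[[], Vector]:
    """Return a sensor whose raw readings pass through a preprocessor."""
    return lambda: preprocess(read_raw())

# Toy stand-in for a learned encoder: scale the signal and clip to [-1, 1].
def toy_encoder(x: Vector) -> Vector:
    return [max(-1.0, min(1.0, v * 0.5)) for v in x]

sensor = make_sensor(lambda: [4.0, -3.0, 0.2], toy_encoder)
print(sensor())  # [1.0, -1.0, 0.1]
```

A postprocessor on an actuator is the mirror image: the network's output vector passes through the module before reaching the actuator.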

DXNN2 now supports neural micro-circuits. The system has been further tuned: the standard double-pole-balancing benchmark is solved in under 1000 evaluations, and XOR-AND-XOR in under 10000 evaluations.

DXNN1 & DXNN2 have been switched to a multi-objective, augmented hall-of-fame selection approach. This has significantly improved the performance of the system. The standard double-pole-balancing benchmark, on which a multitude of neuroevolutionary systems are tested and which can thus be used for comparison with other systems, is now solvable in under 1500 evaluations (as compared to the previous state of the art of 3000+ evaluations). The basic deceptive problem XOR-AND-XOR is solvable in under 25k evaluations on average, as compared to the 500k evaluations required by systems such as NEAT when using novelty search.
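The core of a multi-objective hall of fame is Pareto dominance: an agent enters the hall only if no resident dominates it, and evicts any residents it dominates. A minimal sketch of that bookkeeping (not DXNN's actual Erlang implementation; score tuples here are hypothetical, with "higher is better" on every objective):

```python
# Minimal Pareto hall-of-fame sketch for multi-objective selection.
# Each score is a tuple of objective values, higher is better.
from typing import List, Tuple

Score = Tuple[float, ...]

def dominates(a: Score, b: Score) -> bool:
    """a Pareto-dominates b: no worse on every objective, strictly better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def update_hall_of_fame(hof: List[Score], candidate: Score) -> List[Score]:
    """Admit the candidate unless dominated; evict residents it dominates."""
    if any(dominates(h, candidate) for h in hof):
        return hof
    return [h for h in hof if not dominates(candidate, h)] + [candidate]

hof: List[Score] = []
for score in [(1.0, 5.0), (2.0, 4.0), (3.0, 1.0), (2.0, 6.0)]:
    hof = update_hall_of_fame(hof, score)
print(hof)  # [(3.0, 1.0), (2.0, 6.0)]
```

Parents are then drawn from this non-dominated front rather than from a single-fitness ranking, which is what keeps selection pressure spread across objectives.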

Neural micro-circuit capabilities have been added to the DXNN1 system; a DXNN2 version will be added soon. This allows each node to be not a single neuron, but a feed-forward neural circuit. This increases the capabilities of each node, shifts the computation-to-message ratio in Erlang’s favor, and generalizes what each node can do.
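The idea can be sketched as follows: where a node previously computed one weighted sum and activation, it now evaluates a small feed-forward circuit, so each inter-node message triggers more local computation. An illustrative Python sketch (the real system is Erlang processes; the structures and weights below are hypothetical):

```python
# Illustrative neural micro-circuit: a node evaluates a small
# feed-forward circuit of neurons instead of a single neuron.
# All weights and layer shapes here are hypothetical examples.
import math
from typing import List, Tuple

Neuron = Tuple[List[float], float]  # (weights, bias)

def tanh_neuron(inputs: List[float], weights: List[float], bias: float) -> float:
    return math.tanh(sum(i * w for i, w in zip(inputs, weights)) + bias)

def micro_circuit(inputs: List[float], layers: List[List[Neuron]]) -> List[float]:
    """Each layer's outputs feed the next layer, as in a tiny feed-forward net."""
    signal = inputs
    for layer in layers:
        signal = [tanh_neuron(signal, w, b) for (w, b) in layer]
    return signal

# A 2-1 circuit standing in for what was previously a single neuron:
layers = [
    [([1.0, -1.0], 0.0), ([0.5, 0.5], 0.1)],  # hidden layer, 2 neurons
    [([1.0, 1.0], 0.0)],                      # output layer, 1 neuron
]
out = micro_circuit([0.3, 0.7], layers)
```

One message in, one message out, but several neurons' worth of computation in between, which is what tilts the computation-to-message ratio in Erlang's favor.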

DXNN V2 license changed to Apache 2.0.

A new multi-objective selection algorithm has been added to DXNN. A new type of computational node has been added (neural micro-circuits). New benchmarks have been added: DXNN can now be used for epitope-prediction applications, the XOR-AND-XOR deceptive problem, and deceptive target finding.

DXNN2 has been released to GitHub, and a decoupled version of the 2D simulator Flatland has been released:

Both DXNN and the second-generation DXNN are now available on GitHub. The cross-validation pipeline has now been automated for the DXNN system, allowing the researcher to specify whether to just train, or to train, validate, and test the evolved agents. The benchmarker module has been updated, with the experimental data now saved to the mnesia table experiment, to be used and accessed at any time, with every experiment having its own id/name. Finally, my book, Handbook of Neuroevolution Through Erlang, can now be preordered on Amazon, or from:

DXNN2 has been released to members, and will be released to GitHub shortly. It is a different, more modular, cleaner architecture of the system, and the one developed in my upcoming book: Handbook of Neuroevolution Through Erlang, published by Springer, hardcover ISBN: 978-1-4614-4462-6. New modules are coming out so that the DXNN system can be used on bioinformatics-based problems (epitope prediction, gene marker prediction…). And finally, a Neural Network Research Repository will open within a month.

I will be present at this year’s Genetic and Evolutionary Computation Conference (GECCO), giving a summary of the use of DXNN in financial analysis and automated currency trading.
03.05.2012
The following are the slides from my recent presentation at the Erlang Factory: Unfortunately, even though a really great talk was prepared, due to the difficulty of connecting my Linux machine to the projector, the talk started 15 minutes late, and there was not enough time to even finish it. I’ll now be carrying a secondary machine with OSX/Windows to conferences, as it seems that Linux is just not yet ready to interface with other standard systems.

I’m currently exploring extending the Kerl module, which interfaces Erlang with Player/Stage, to use NIFs. Will add the projects section at the end of this semester; it will list the projects, people involved, progress, and phases completed.

1. Beginning integration of evolutionary strategies into DXNN.
2. Modifying the DXNN architecture from using the cortex element as a gatekeeper to using it as a synchronizer. This will allow for an easier path to decoupling the system completely.

Unfortunately, things are taking longer than expected, as the PhD program and the book writing are demanding substantially more time than originally anticipated.

I plan on releasing the source code for DXNN by August, as soon as I finish documenting it. I will host it on GitHub.

A while back I began writing a textbook on the construction of Topology and Weight Evolving Artificial Neural Networks (TWEANNs) in Erlang. It is a how-to book covering in detail all the steps, from simulating a single neuron to developing a fully distributed TWEANN platform. The book will also present a number of application areas (time series analysis, robotics, ALife…), and provide the source code for all the projects. The book is scheduled for completion by the end of this year.