Tag Archives: optimisation

DAASE Members win Grant to Develop Autonomous Satellites

DAASE Members Earl T. Barr and David R. White are part of a consortium that has won highly competitive funding from the Centre for Earth Observation Instrumentation (CEOI). The funding will enable the application of advanced AI algorithms to the next generation of nano-satellites.

Nano-satellites are a major trend in satellite design: many small, low-powered satellites are built from commodity components, rather than the high-specced parts traditionally used to construct larger satellites. This greatly reduces their cost and has enabled a new generation of applications, particularly through the deployment of “constellations”: groups of satellites operating in tandem to monitor phenomena such as climate change and to warn of impending natural disasters.

David and Earl will be working with industrial experts at space companies Craft Prospect and Bright Ascension, alongside the University of Manchester. Their focus will be on the application of Deep Learning and Code Optimisation methods to the satellite software, taking advantage of emerging hardware platforms that offer enhanced capabilities on-board.

“In the same way that commodity hardware components revolutionised satellite construction, we believe there is potential for a revolution in the software domain,” said David, who will take responsibility for algorithm development on the project.

By applying the power and commoditised tools of machine learning that have emerged over the last decade, the researchers intend to make the satellites more autonomous, reducing the demands on the data connection to the ground station. The work will involve a combination of cutting-edge Machine Learning methods alongside the improvement of industry-standard software through methods such as Genetic Improvement.

“We’re excited to see how far we can take this,” says White. “The timing is perfect for a new wave of satellite applications enabled by AI.”

Opacitor: search-based energy optimization of some ubiquitous algorithms #SBSE

DAASE project blogpost by Dr Sandy Brownlee, Senior Research Assistant, DAASE project, Stirling University

As more of our time is spent on our smartphones, the curse of the empty battery matters more and more. While battery technology is continually advancing, we can also tackle the problem by reducing the power the phone consumes. Several factors (such as the display, wifi and GPS) affect power consumption, but the CPU that runs the phone’s software is still a big consumer of energy. For example, the maximum CPU power of a Samsung Galaxy S3 is 2,845 mW: 2.53× the maximum power consumption of the screen and 2.5× that of the 3G hardware.

At the other end of the scale, the electricity consumed by the data centres run by the likes of Google and Facebook contributes to global climate change: in fact, servers accounted for between 1.1% and 1.5% of global electricity production in 2010. So reducing the energy consumed by CPUs is important!

Much of the code being run relies on the same few algorithms, reimplemented for particular applications and hardware. Ideally, code would be retuned for each new application, but doing this by hand for every application and platform is impractical, so automated approaches are needed. We already have such automation for making software run faster: a tool called a “compiler” takes human-readable source code and turns it into machine code, making speed improvements along the way. Often, making a program run faster also means it consumes less energy, but some instructions are more power-hungry than others, so simply speeding the code up doesn’t guarantee a longer-lasting battery.

We have developed a tool called Opacitor, designed to measure the energy consumed by programs written in Java, one of the most popular programming languages. Opacitor estimates the energy each instruction will consume on a live system. This can be combined with search-based techniques to explore possible changes to a given program, finding a low-energy variation. The work is reported in a journal paper published this month, which focuses on three aspects of reducing the energy consumed by software.
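Opacitor’s internal model is not reproduced here, but the core idea behind instruction-level energy estimation can be sketched as follows: if each instruction kind has a known average energy cost, the estimated energy of a run is the cost-weighted sum of the instruction counts. The class name and all numbers below are illustrative assumptions, not Opacitor’s actual figures.

```java
import java.util.Map;

// Toy illustration of instruction-level energy estimation (not Opacitor's
// actual model): total energy = sum over instruction kinds of
// (times executed) x (average energy cost per execution).
public class EnergyEstimate {
    public static double estimate(Map<String, Long> opcodeCounts,
                                  Map<String, Double> costNanojoules) {
        double total = 0.0;
        for (Map.Entry<String, Long> e : opcodeCounts.entrySet()) {
            // Instruction kinds with no measured cost fall back to a default.
            total += e.getValue() * costNanojoules.getOrDefault(e.getKey(), 1.0);
        }
        return total;  // in nanojoules, given nanojoule costs
    }
}
```

With such a model, two variants of the same program can be compared for energy without re-running them on instrumented hardware each time, which is what makes search over many program variants affordable.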

Firstly, programs perform differently depending on the data they process. A common software task is sorting data, and variants of the Quicksort algorithm are a popular way to do it. Quicksort repeatedly chooses an element of the data, called the pivot, and rearranges the data around it. The choice of pivot and the “distribution” of the data being sorted affect how well this works. Our approach automatically generated pivot-selection strategies that can be tuned to different distributions of data, with the sorting operation using up to 70% less energy than conventional approaches.
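The evolved pivot functions themselves are in the paper; as a minimal illustration of the underlying idea, here is a Quicksort in which the pivot choice is a pluggable strategy — exactly the kind of tuning knob a search method can specialise to a given data distribution. The class and strategy names are our own, not the paper’s.

```java
// Minimal sketch (not the paper's code): Quicksort with a pluggable pivot
// strategy. Different strategies suit different data distributions.
public class TunableQuicksort {
    // A pivot strategy picks an index in [lo, hi] to partition around.
    public interface PivotStrategy {
        int choosePivot(int[] a, int lo, int hi);
    }

    public static final PivotStrategy LAST = (a, lo, hi) -> hi;
    public static final PivotStrategy MIDDLE = (a, lo, hi) -> lo + (hi - lo) / 2;
    // Median-of-three often behaves better on partially sorted data.
    public static final PivotStrategy MEDIAN_OF_THREE = (a, lo, hi) -> {
        int mid = lo + (hi - lo) / 2;
        if ((a[lo] <= a[mid] && a[mid] <= a[hi]) || (a[hi] <= a[mid] && a[mid] <= a[lo])) return mid;
        if ((a[mid] <= a[lo] && a[lo] <= a[hi]) || (a[hi] <= a[lo] && a[lo] <= a[mid])) return lo;
        return hi;
    };

    public static void sort(int[] a, PivotStrategy pivot) {
        sort(a, 0, a.length - 1, pivot);
    }

    private static void sort(int[] a, int lo, int hi, PivotStrategy pivot) {
        if (lo >= hi) return;
        int p = partition(a, lo, hi, pivot);
        sort(a, lo, p - 1, pivot);
        sort(a, p + 1, hi, pivot);
    }

    private static int partition(int[] a, int lo, int hi, PivotStrategy pivot) {
        swap(a, pivot.choosePivot(a, lo, hi), hi);  // move chosen pivot to the end
        int pivotValue = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivotValue) swap(a, i++, j);
        }
        swap(a, i, hi);
        return i;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}
```

A search method can then treat the strategy itself as the thing being evolved, evaluating each candidate on sample data drawn from the target distribution.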

Secondly, energy consumption interacts with other software properties. Would you accept a lower-quality camera, or a few more speech-recognition mistakes, in exchange for another few minutes of battery when it’s nearly empty? We were able to show how multi-layer perceptrons (the core of deep learning and of applications like text recognition) can be tuned to balance accuracy against energy use. Sometimes we could even get the best of both, but the relationship is complicated enough that automatically exploring the trade-off between the two properties is very helpful.
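The paper’s actual perceptron tuning is search-based; the trade-off exploration it relies on can be illustrated with a small Pareto-front computation over candidate configurations. Here each candidate carries a made-up (accuracy, energy) measurement; the non-dominated candidates form the trade-off front a user or search method would choose from.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: keep the Pareto-optimal candidates, where
// accuracy is to be maximised and energy minimised.
public class ParetoFront {
    public record Candidate(String name, double accuracy, double energyJ) {}

    // c dominates d if it is at least as good on both objectives and
    // strictly better on at least one.
    static boolean dominates(Candidate c, Candidate d) {
        return c.accuracy() >= d.accuracy() && c.energyJ() <= d.energyJ()
            && (c.accuracy() > d.accuracy() || c.energyJ() < d.energyJ());
    }

    public static List<Candidate> front(List<Candidate> pop) {
        List<Candidate> result = new ArrayList<>();
        for (Candidate c : pop) {
            boolean dominated = false;
            for (Candidate d : pop) {
                if (dominates(d, c)) { dominated = true; break; }
            }
            if (!dominated) result.add(c);  // no candidate beats c on both objectives
        }
        return result;
    }
}
```

A configuration that is both less accurate and more power-hungry than some alternative never survives; everything that remains represents a genuine accuracy-for-energy trade.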

Thirdly, different components of an application interact in very subtle ways. Modern software typically makes use of libraries (software written by others to do common tasks), and there are often many alternatives doing the same job. It isn’t as simple as choosing the most energy-efficient libraries and assembling them into one application, because two parts might be more energy efficient together than separately, perhaps because they share a common resource. We showed that by searching over different combinations of “container classes” (software components for storing data), we were able to reduce energy consumption by almost 40%.
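Because components interact, the combination has to be scored as a whole rather than component by component. As a hedged sketch (not the paper’s tooling), here is an exhaustive search over the choices for two container “slots”, where the score for each pair comes from a supplied cost function; in the real work that score would be an energy estimate from a model like Opacitor’s, while here it is an arbitrary stand-in.

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

// Illustrative sketch: try every combination of implementations for two
// slots, scoring whole combinations so that interactions between the two
// choices are taken into account.
public class ContainerSearch {
    public static <A, B> Map.Entry<A, B> best(
            List<A> slot1Options, List<B> slot2Options,
            BiFunction<A, B, Double> cost) {
        Map.Entry<A, B> best = null;
        double bestCost = Double.POSITIVE_INFINITY;
        for (A a : slot1Options) {
            for (B b : slot2Options) {
                double c = cost.apply(a, b);  // scores the pair jointly
                if (c < bestCost) { bestCost = c; best = Map.entry(a, b); }
            }
        }
        return best;
    }
}
```

Exhaustive search is only feasible for a handful of slots; with many slots the space of combinations explodes, which is why the paper turns to search-based techniques rather than enumeration.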

Overall then, this seems to be an area ripe for further development, and we hope that this work represents a step towards the ideal where software compilers include energy improvements as a matter of course.




Homage to Ada Augusta Lovelace: DAASE weaves optimisation and operation



We launched the DAASE project on 16th October 2012, Ada Lovelace Day. What an auspicious day on which to launch our project, concerned as it is with the problem of optimising software code using automated search techniques. At the meeting we reminded ourselves of Ada Augusta Lovelace’s insights, written more than a century and a half earlier:

“The order in which the operations shall be performed in every particular case is a very interesting and curious question, on which our space does not permit us fully to enter. In almost every computation a great variety of arrangements for the succession of the processes is possible, and various considerations must influence the selection amongst them for the purposes of a Calculating Engine. One essential object is to choose that arrangement which shall tend to reduce to a minimum the time necessary for completing the calculation.”
(AAL, Note D on the Analytical Engine, 1843)

How prescient these words have turned out to be. Of course, Lovelace is much better known for the quote from Note A on the analytical engine:

“We may say most aptly, that the Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves.”
(AAL, Note A on the Analytical Engine, 1843)

Perhaps the fame of this quotation is hardly surprising, coming as it does from the daughter of a poet and a mathematician. What more evocative statement could there be about the subtlety and delight of programming than this, uniting the scientific and aesthetic qualities of software?

However, it is the first quote (from Note D) in which Lovelace not only recognises the vital importance of software, but also the critical role that optimisation plays in its behaviour. Her words, written some 170 years before our meeting, remain relevant and important today. Perhaps they are even more important to us than they were in her own day, illuminated only by gaslight (Thomas Edison would not be born for another four years).

In our project we seek not merely to automatically rearrange the order in which operations are performed, but also to automatically re-engineer new versions of programs that can meet multiple conflicting and competing objectives.

We wish to change, fundamentally, the perception of software as a craft-based artefact, produced by the skill of human hands and guided by human brains. To us, software is an engineering material that can and should be produced by automated construction and reconstruction, evolution and revolution. The role of the human brain is retained in describing the objectives to be met. In our vision, these are translated into fitness functions which guide the construction and post-deployment evolution of the software.

In so doing, we address the original goals of the many declarative programming and self-adaptive systems initiatives: leading software engineers away from the messy details of how software should achieve its goals, towards the more valuable role of describing what those goals should be, so that systems can be truly self-adaptive, reconstructing themselves, in situ, in response to their changing environments.

We seek not merely to reduce to a minimum the time necessary (making software faster), but also to attack a wide range of other so-called “non-functional” properties of software (throughput, power consumption, security attack surface; anything that can be described and measured about the behaviour of a system).

We hope to change the way in which software is developed. Instead of writing one program by hand to meet one specific objective, we will seek to deploy optimisation into the code itself, so that it can re-optimise and re-balance itself as its operating environment changes.

Software engineers are both blessed and cursed by the supreme flexibility of their intangible engineering material. DAASE will focus on the blessing. We may say most aptly, that the DAASE project weaves together optimisation and operation so that software becomes dynamically adaptive.

See also:

Five Minute Audioboo interview of Mark Harman by Sue Black about Ada Lovelace