
Software bottlenecks necessitate innovative development tools


Feature articles |
By eeNews Europe



Producing a working software system within project timescales and budget constraints is of paramount importance, but it is not the only challenge. Engineers also need to maximise the efficiency of their design to gain a competitive advantage, optimising against a variety of key parameters, such as:

– Completeness and feature set of their design

– Hardware cost

– Code size

– Performance level

– Power budget

The way in which optimisations are applied is still relatively crude compared with the cutting-edge embedded systems being designed. Existing code generation tools apply processor-focused optimisations, which makes them inadequate for anything but the most rudimentary instruction sequences. The interactions between the processor and the memory system have the greatest influence on the performance of an embedded system, so these must be addressed directly. Instead, traditional tools try to identify the most common control flows through a program and optimise for that dynamic behaviour, at the expense of other, seemingly less important flows.

In reality, however, programs exhibit highly dynamic behaviours, and even infrequently taken control flows can make dramatic changes to the processor/memory state, affecting the behaviour of other flows. Profiler tools can tell developers which system resources are being expended (memory bandwidth, cache hits/misses, etc.), but this is not the whole picture: it says nothing about the fundamental causes. The interactions of software and hardware in modern processor systems are so complex that programmers cannot directly control many of these effects from their source code, and comprehending the underlying relationships is often beyond even the most skilled engineers.

There is a considerable gap between hardware's theoretical performance, including new features such as high-performance tightly-coupled memories, and what can be achieved with traditional software development tools. This is starting to cause major issues, and in many cases unnecessary compromises are being made to alleviate acute time-to-market pressures. It is for such reasons that the vast majority of design projects fail to achieve their initial goals, falling short of their true potential and not fully utilising the performance features of modern hardware.


Optimising in the “bigger picture”

Programmers should be free to concentrate on optimising the overall algorithm and architecture of their software. The fundamental problem at the heart of current development tools is their inability to control low-level hardware/software interactions, which leaves a "human in the loop" trying to optimise fundamental aspects of the software's execution by altering the source code.

Such efforts to remove execution bottlenecks using current optimisation techniques are poorly coordinated and lead to new problems being created elsewhere. This is akin to “balloon squeezing” – rather than making the balloon smaller it just pushes air around and causes bulges elsewhere. Even seasoned engineers will struggle to coordinate their efforts under such circumstances.

It has become clear that the current approach is not capable of furnishing engineers with the level of functionality they require. The tools available only encourage development and optimisation procedures that are both unpredictable and time consuming. Engineers need more sophisticated development tools that support hardware/software interactions and whole-program optimisations which cannot be expressed in the source code, while also being much less labour intensive to use. This would let them reach more optimal designs while reducing the time a project takes, the technical resources it requires and the associated costs. As well as curbing upfront investment, a major upshot of these shorter development cycles is faster progression to the point where the end product can generate serious revenue.

It is increasingly recognised that a deterministic, highly automated approach to hardware/software and whole-system optimisation will make the development process far more efficient. Engineers will have more time to exercise their creativity, letting the tools automatically optimise their software into the available hardware resources.

Though existing whole-program compilation techniques are limited by the need to remain compatible with widely deployed compilation/profiling systems and source code, advanced development tools are now starting to emerge that can significantly enhance system performance and shorten the time needed to complete embedded design projects.

By making the entire code generation flow aware of the whole hardware platform, including the coupling of the processor and the memory system, optimisations can achieve results beyond the scope of source code changes, existing tools and increasingly outdated manual techniques. Technology start-up Somnium is pioneering this approach. The company is working closely with leading semiconductor manufacturers to produce both generic and product-specific software development solutions that, unlike conventional optimisers, carry out highly device-specific optimisations automatically, without requiring any form of profiler feedback. The upshot is boosted productivity, with programs that are smaller and more efficient across all dynamic behaviours.

In conclusion, embedded system designs often reach completion later than expected and fail to satisfy the cost, performance and functionality targets set at the outset. To a large extent, this is due to the widening disparity between what engineering teams are seeking to accomplish and what their development tools can deliver.

The conventional, manual approach to design optimisation must be replaced by a process that can cope with the inherent complexity and that largely removes the human element from the loop. Instead of centring optimisation entirely on the processor, the system as a whole must be examined, with the available on-chip and off-chip memory systems being key to the optimisation strategy.

The items discussed in this article all point to a compelling need within the industry for a new breed of software development tools which are ‘device aware’ and where both the code generation and data sequencing processes are automated in their entirety. Through efficient and deterministic optimisation, as proposed here, engineers can make much better use of the hardware they have at their disposal – so they can achieve more with less.

David Edwards is CEO/CTO and Founder, Somnium Technologies
