In these Awards, effectiveness is measured in terms of commercial benefits achieved for the client and the customer and is not related to standards of aesthetics, or ‘good’ and ‘bad’ design.
Within your results section, you must clearly detail what has been achieved by the design and how the results measure against the original business objectives whilst providing the wider market context.
Market context and scale of impact are vital when judges assess award levels. For example, a dramatic percentage increase from a low base can be less impressive than a small increase from a relatively high base.
How the results performed against original business objectives
The judges are looking to be able to relate the results back to the original objectives of the project. How do your results stack up against the objectives? How challenging were the original objectives? Was the original scope of the project exceeded? To what extent has the design work influenced the client’s business strategy, direction and / or decision making?
Performance: scale of effect and breadth of impact
To assess effectiveness, judges will want to fully understand the solution's performance over time, as well as its scale of effect and breadth of impact. Explanation and proof of the significance of your results in the relevant business and / or market context is crucial.
Some methods of explaining performance include, but are not limited to:
— Performance against objectives.
— Performance against market norms or against competitive products or companies.
— The performance of a re-design against the performance of the old design.
— The use of research after (and where possible, before) launch to establish a causal relationship between the design and its effectiveness.
Performance can be evaluated in a broad range of ways; however, it is the significance and relevance of the results, together with clear evidence of the link between design and those results, that is key. The use of both quantitative and qualitative measures within your entry is encouraged, but make sure you justify how the qualitative data supports the case and contributed to the results.
Cause and effect of design solution
Judges will be looking for clear proof of a cause and effect between the design solution and the results. Judges are not looking for an in-depth description of the design solution, but will want to clearly understand how the design decisions behind the solution impacted the results. This is your opportunity to explain how design created a shift and enabled business growth with a convincing link between the design solution and the results.
In these Awards, effectiveness is measured in terms of commercial, behavioural, societal and broader business benefits achieved and is not related to standards of aesthetics or ‘good’ and ‘bad’ design. Judges want to understand how results were achieved and what made this design solution work.
Need some ideas for what metrics could be used? See page 10 of the Entry Pack (the 2019/20 entry pack will launch at the end of June 2019).
Proof of effect and other influencing factors
Design is rarely the only factor influencing a project’s commercial success, and design activity is often intrinsically linked to other business activity. The aim of a DBA Design Effectiveness Awards entry is to prove beyond reasonable doubt a cause and effect between the design solution and the results. If other elements had an effect on the success of the project, you should explore, explain and evaluate their impact in order to help prove how your work created the results claimed for the design. If you believe there were no other influencing factors, state this explicitly so the judges can see they have still been considered.
You must tackle this area head on and ensure the judges aren’t left with unanswered questions, as a lack of convincing information will count against your entry.
Not sure what counts as an influencing factor? See pages 11–12 of the Entry Pack (the 2019/20 entry pack will launch at the end of June 2019).
Judges are looking for clear, concise and comprehensive entries. If the entry is well written and structured, it will deliver a stronger message. Mistakes within the results section, for example inconsistent or inaccurate metrics, will count heavily against an entry, so take every care to check your entry contains no errors.
Sources and types of data
A project’s success must be linked to measured fact rather than assumption, so the use of factual research to substantiate your claims is advised. Fact-based data, such as Nielsen statistics, carry substantially more weight than anecdotal points of view, for example a product manager’s subjective opinion on a finished product. Always indicate the sources of statistics or other information quoted.
Projected and forecast data is not admissible.