Automatic UI Generation
- Thomas Memmel, HCI Lab University of Konstanz, Germany
Recently, I have been doing some research on the automatic generation
of user interfaces - based on models and formal description languages
such as UIML, XUL, IDS, XIML, etc.
Most research papers I have read argue that automatic code generation
is inappropriate and would most probably lead to standardized,
non-individual user interfaces.
On the other hand, there is a great movement in the software
engineering community towards model-based development (MBD). One
cornerstone of MBD is the transformation of models into code. Following
usage-centered design, the traceable transformation from models into
UI code is exactly what would characterize a design by engineering
(as opposed to design by trial-and-error, as in user-centered design).
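As a toy sketch of what such a model-to-code transformation might
look like (the dictionary schema and the generator here are my own
invention, standing in for a real MBD tool's model format):

```python
# Hypothetical model-to-code transformation: an abstract UI model
# (a plain dict, standing in for a UIML/XIML-style document) is
# turned into concrete Tkinter source code.

def generate_tkinter(model):
    """Emit Tkinter source code for a simple form model."""
    lines = [
        "import tkinter as tk",
        "root = tk.Tk()",
        f"root.title({model['title']!r})",
    ]
    for widget in model["widgets"]:
        if widget["type"] == "label":
            lines.append(f"tk.Label(root, text={widget['text']!r}).pack()")
        elif widget["type"] == "button":
            lines.append(f"tk.Button(root, text={widget['text']!r}).pack()")
    lines.append("root.mainloop()")
    return "\n".join(lines)

model = {
    "title": "Login",
    "widgets": [
        {"type": "label", "text": "User name:"},
        {"type": "button", "text": "OK"},
    ],
}
print(generate_tkinter(model))
```

The point of the sketch is traceability: every line of generated
code can be traced back to one element of the abstract model.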
Well, I think it should remain the designer's task to fill
in detail where abstract models previously specified the core UI
concept (compare Norman's idea of activity-centered design and the
important role of the designer in that approach).
But I could imagine that the generation of code based on more or
less abstract descriptions could be useful whenever it is necessary
to have high-fidelity prototypes in early design phases. Such
prototypes need to be easy to build and easy to change in order to
keep the process "agile". (It is very often necessary to externalize
your ideas very early to be able to discuss and evaluate them with
stakeholders who want to see and experience something - rather than
looking at abstract designs. You won't be able to stay abstract in
project environments where the corporate design is an important issue
and managers want to see what it will look like - sad but true).
I therefore took a look at the iRise studio toolkit (there were
some posts about it several weeks ago). Indeed, it is really easy to
build usage scenarios and transform them into running code with
iRise, and you can choose how detailed the prototype should look.
Great tool. I think it qualifies as a vehicle for participatory design and
rapid, dynamic hi-fi prototyping.
Thus, I thought about the advantages and disadvantages of prototyping
tools like iRise compared to those working with XML-based modeling languages.
I guess the most important difference is the necessity of learning a
modeling language if you work with XML-based solutions. On the other
hand, XML documents may be very useful as a specification of how the
final product should look - a machine-readable add-on to the
visual prototype that can be generated from it.
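As a rough illustration of that idea (using my own toy XML format,
not actual UIML or XIML), the same machine-readable document could
drive a generated prototype:

```python
# A toy XML UI specification (invented format, not a real standard)
# parsed into a simple HTML prototype. The same document remains
# available as a machine-readable spec of the final product.
import xml.etree.ElementTree as ET

SPEC = """
<ui title="Search">
  <label>Query:</label>
  <button>Go</button>
</ui>
"""

def to_html(spec_xml):
    """Generate a minimal HTML prototype from the XML spec."""
    root = ET.fromstring(spec_xml)
    body = []
    for el in root:
        if el.tag == "label":
            body.append(f"<label>{el.text}</label>")
        elif el.tag == "button":
            body.append(f"<button>{el.text}</button>")
    return (f"<html><head><title>{root.get('title')}</title></head>"
            f"<body>{''.join(body)}</body></html>")

print(to_html(SPEC))
```

The prototype is thrown away or regenerated at will; the XML stays
as the specification the programmers work against.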
I would be interested in your opinions and what experience you may
have gathered using model-based software engineering methods.
Furthermore, as I think that there is a growing need for visual
specification (prototypes that make interactivity something you can
see and feel) rather than specification in text, what kind of visual
specification language would be needed? Have you been in situations where a
kind of visual specification of the UI was requested as a deliverable
to programmers? What tools did you use?
- Hi, Thomas. I'm a programmer rather than a UI person, so take this
answer for what it's worth.
Thomas Memmel wrote:
>On the other hand, there is a great movement in the software
>engineering community towards model-based development (MBD). One
>cornerstone of MBD is the transformation of models into code.
>Following usage-centered design, the traceable transformation from
>models into UI code is exactly what would characterize a design by
>engineering (as opposed to design by trial-and-error, as in
>user-centered design).

Is there a great movement for this? I have seen tool vendors and
standards committees very excited, but I don't know any productive
developers who have shifted over to MBD tools. When I contrast this with
the rapid adoption of a variety of open-source tools and libraries, or
the recent rise of things like refactoring or unit testing, I suspect
that MBD is more the latest buzzword-compliant fad than an actual
movement. Of course, my sample isn't random, so it's perfectly possible
that I'm missing something.
>I would be interested in your opinions and what experience you may
>have gathered using model-based software engineering methods.
>Furthermore, as I think that there is a growing need for visual
>specification (prototypes that make interactivity something you can
>see and feel) rather than specification in text, what kind of visual
>specification language would be needed? Have you been in situations
>where a kind of visual specification of the UI was requested as a
>deliverable to programmers? What tools did you use?

For things where the visual appearance is important, I think visual
prototypes are great. I have received excellent prototypes built up in
things like HTML and image files. Of course, I've also received
excellent prototypes on whiteboards and napkins. My concern with more
UI-specific prototyping tools (e.g., drag-and-drop builders) is that
they might constrain innovation as much as they would support it.
Again, this could be a function of the designers I happen to know, but
the best ones I've worked with all tend to use pretty general tools
(e.g., pencils, Photoshop). I've never asked why, but I'd guess that
they already have the knowledge of the domain that a more focused tool
would try to embody and their facility with general tools is such that a
specific tool doesn't get them much performance gain.
However, I'm very suspicious of any attempt to treat prototypes as
specifications rather than prototypes. The assumption behind a
specification is that thinking and learning are now basically done, that
all relevant factors are known and have been integrated. Perhaps one day
I'll see a specification where that's really the case, but so far I've
never seen one come close.