Project Overview

We are a group of Interactive Media Design students at the Art Institute of Atlanta. Our mission was to challenge our design assumptions and to build a tool for gathering feedback. The idea stems from an existing online tool, Visual Ideology. We set out to test interface questions we have come across in the past. For example, should the main navigation go on the left side of the page or across the top? Should we use text links instead of icons? There are many such examples.

Testing Methods

We carefully considered which technology to use for the survey. Ultimately, Flash was chosen over a CSS site, mainly for its graphing capabilities. We also felt a Flash site had more opportunity to be "slick": we wanted users to have an impressive interaction that was easy to use, with smooth transitions and immediate results. While much of this is possible without Flash, the group felt Flash offered more flexibility and room to expand the tool if needed, along with more "bells and whistles." If we built a successful tool and site, not only would many users see our work, but so would potential employers. It was very important to make the interface not just usable, but impressive as well.

Thought was given to the fact that not every user has the Flash plugin. Experience with web analytics programs shows that the large majority of the public has the plugin, but not everyone. In particular, we anticipated that our audience would largely be web designers, who were likely to have it. It was a calculated risk that some users would be turned away, and we accepted that risk.

The back-end technology uses PHP to interact with a MySQL database where the user clicks are stored. The Flash file is embedded in HTML, and CSS is used to format the page.
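As a rough sketch of that front-end layering, a Flash movie of this era could be embedded in standards-based HTML with fallback text for users without the plugin. The file name and dimensions below are placeholders, not the project's actual values:

```html
<!-- Illustrative only: "survey.swf" and the dimensions are placeholders. -->
<object type="application/x-shockwave-flash" data="survey.swf"
        width="800" height="600">
  <param name="movie" value="survey.swf" />
  <!-- Shown when the Flash plugin is not available -->
  <p>This survey requires the Flash plugin.</p>
</object>
```

The `<p>` fallback also softens the calculated risk described above: users without the plugin at least see an explanation rather than a blank page.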


We acknowledge that this is not a definitive test. Each question offers only two answers out of many possibilities. The experience is oversimplified, and because of that we cannot definitively choose one option over the other. The information gathered in this test is merely suggestive of the thoughts and opinions of our users. We will not abandon certain design patterns or practices upon arriving at unexpected results; instead, further testing is needed to better isolate the variables being tested and to improve the overall data collection process.

The tool itself operates correctly; however, feedback after the launch tells us that users sometimes double-click the graphic, probably out of impatience on slow network connections. The tool is designed for single clicks, so a double click registers the first click as the answer to the current question and the second click as the answer to the next question. This bypasses the user's intended input and records an erroneous click. These errors could skew the data. While some might say this negates the entire survey, our stance is that the skewed data is minimized by the large number of responses received. In other words, erroneous data does exist, but so does non-erroneous data, which allows us to assume that some of the results are truly indicative of our users' thoughts and opinions.

If a person is from a culture that typically reads from left to right, there could be some bias toward the left. If the culture reads right to left, there could be bias toward the right.

Many users commented on a possible left-handed or right-handed bias. This possibility is acknowledged, and further testing is needed to explore how much bias exists. The data collected so far on left vs. right placement was gathered with each graphic on its intuitive side: images depicting a left-sided preference were actually placed on the left, and images depicting a right-sided preference were placed on the right. To counteract this, further testing could switch the sides and re-test the same graphics; the image depicting the left-side option would be placed on the right, and vice versa. This should counteract any positional bias.
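The swap described above can also be randomized per respondent, so the side assignment itself becomes data. A sketch of this counterbalancing idea, in JavaScript for illustration (the function and variant names are hypothetical, and the random source is injectable only to make the sketch testable):

```javascript
// Randomly assign which side each variant appears on, and keep the
// assignment with the vote so a pure side bias can later be measured
// by comparing results across the two placements.
function assignSides(variantA, variantB, rng = Math.random) {
  const flipped = rng() < 0.5;   // half of respondents see the swap
  return flipped
    ? { left: variantB, right: variantA, flipped: true }
    : { left: variantA, right: variantB, flipped: false };
}
```

If one variant wins regardless of which side it is shown on, the preference is likely real; if the left side wins regardless of variant, the bias is positional.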

Your survey itself is fatally flawed and useless. There's no way to express equal preference for either. The bitmaps are also scaled down and blurred so badly that it's an invalid test of what the user would actually see.

Some comments concerned graphic size or quality. It was difficult to design graphics that accurately depicted what we wanted to test because of the size constraints; each graphic had to be small enough to fit inside the Flash tool. The team went through many revisions of the graphics. Some had to be zoomed out far enough to give contextual placement, such as the graphic testing search box placement. That graphic was revised many times, because zooming in to make it clearer meant losing the context of its placement on the page.

Future considerations for a revised tool could be to:
  • dedicate more screen space to the graphics, with just a small navigation bar at the top
  • move the tool to a CSS-based engine and scale the graphics to the user's screen size
  • offer the option to click through to a real website that depicts the variable being tested


These findings are based on the number of users at the time of this writing: 19,282. The test remains open until December 31, 2007, and another look at the data will be performed at that time. We anticipate that while there will be more data, the overall findings will remain the same. Any new findings or revelations at that time will warrant a revision of this report.

When starting the survey, we anticipated that our audience would be primarily web designers. Statistics show that 85% of our users were male and 66% were under 30 years old, consistent with what we have read about the web designer population: they tend to be young and male. Only 280 users, or 1.5% of the total, were 60 or older.

At launch, the tool had an error in the age groupings in the drop-down list: it jumped from "less than 20" straight to "21-30," leaving 20-year-olds with no option. Many of them wrote to tell us how displeased they were. This oversight was corrected in the tool at about 20,000 users.

Question 1

Question 2

How come the text is not English in the second question? How can you expect reasonable feedback on that question when the question asked is "Which block of text is easier to read?"

Coming soon

Question 3

Coming soon

Question 4

Coming soon

Question 5

Coming soon

Question 6

Uhh, for the submit button one I like my submit button in the middle with the cancel button in smaller text underneath it

Coming soon

Question 7

… I picked the green one just because of the better contrast with text

Coming soon

Question 8

… On the page about necessary fields in forms, I felt that the color red was better than blue, but highlighting fields is a better option altogether…

Coming soon

Question 9

The [x] close button is influenced by the Windows GUI. If it had been a Red Circle with an X in it like Mac OSX's close button, I would have preferred it on the left side…

Coming soon

Question 10

Some of your questions won't work… For example, the printer icon strongly depends on the situation where it is used: if the program obviously needs printing, the icon without the text is fine

Coming soon
