VisWorkshops Our chart producer workshops showed that design iterations with multiple visualization experts uncovered key considerations, such as whether a chart type is not only theoretically appropriate but also effective at conveying the intended message. Despite time constraints in practice, such collaboration can improve a visualization's effectiveness and thereby viewers' understanding.
ClimateVisInterviews In our expert interviews, collaboration between climate scientists and visualization experts on climate data visualizations was seen as crucial to ensure clear information presentation and to allow for iterative refinement.
SciAmInterviews At Scientific American, fact-checkers review visualizations for internal consistency, cross-format accuracy (e.g., web, mobile, print), and alignment with source data. When working with data provided by scientists, magazine staff routinely run final visualizations past them to confirm accurate interpretation.
DomainVisStudy In our evaluation of prototypes for medical network data visualizations, domain experts initially favored aesthetically novel low-fidelity designs. Their preferences shifted during high-fidelity testing, however, once additional features such as interactivity were introduced and usability played a greater role. This demonstrates the need for parallel prototyping and for continued testing beyond early design phases.
Collaborative design processes help integrate different perspectives, for example technical, communicative, and domain-specific ones. This can improve how well visual form, data, and viewer considerations align.
ClimateVisInterviews Differences in how lay viewers and experts interpreted the same visualizations highlighted the need for audience testing. Some lay participants misunderstood or disengaged from visualizations that experts considered effective. Testing with target groups was recommended to identify potential mismatches and improve clarity.
VisProducerInterviews Most data visualization practitioners did not use structured user testing, often citing time constraints. Instead, they commonly relied on internal feedback from editors or peers, which was seen as useful and more feasible within tight timelines. However, several interviewees voiced concern about depending solely on internal judgment, emphasizing that creators are not neutral stand-ins for the audience and may misjudge clarity. Structured user testing or external feedback was viewed as more reliable and was more commonly practiced in industry settings where feedback loops are built into the process.
Iterative testing with target viewers can help ensure that design choices are easily interpretable, for example that the intended message comes across clearly or that color choices do not cause confusion.
VisWorkshops Our workshops and interviews included people of varied ages (including many aged 75+ and many students), people with non-academic educational degrees, and people with varying levels of visualization experience. This diversity helped surface insights that might have been missed in a more homogeneous group.
VisLiteracySurvey In a representative* survey, 37% of respondents voluntarily shared critiques of the climate-related visualizations they had engaged with, yielding a total of 377 constructive comments from viewers aged 18 to 74.
*Representative of Austria's population aged 18–74 in terms of age groups and male/female gender split; n = 438.
Involving diverse audiences, for example across age, education, or expertise, can support designing visualizations for different target groups and may help uncover specific gaps in interpretation.

