As design education outgrows traditional critique-based assessment, this article explores how AI-driven analytics—specifically scale and cluster detection—can provide instructors with deeper, faster insight into students’ multiscale design organization, complementing existing dashboards and addressing gaps in human-led evaluation.

AI-Powered Analytics Offer New Pathways for Evaluating Multiscale Design Work

2025/12/08 07:03

Abstract and 1. Introduction

  2. Prior Work and 2.1 Educational Objectives of Learning Activities

    2.2 Multiscale Design

    2.3 Assessing Creative Visual Design

    2.4 Learning Analytics and Dashboards

  3. Research Artifact/Probe

    3.1 Multiscale Design Environment

    3.2 Integrating a Design Analytics Dashboard with the Multiscale Design Environment

  4. Methodology and Context

    4.1 Course Contexts

    4.2 Instructor interviews

  5. Findings

    5.1 Gaining Insights and Informing Pedagogical Action

    5.2 Support for Exploration, Understanding, and Validation of Analytics

    5.3 Using Analytics for Assessment and Feedback

    5.4 Analytics as a Potential Source of Self-Reflection for Students

  6. Discussion and Implications: Contextualizing Analytics to Support Design Education

    6.1 Indexicality: Demonstrating Design Analytics by Linking to Instances

    6.2 Supporting Assessment and Feedback in Design Courses through Multiscale Design Analytics

    6.3 Limitations of Multiscale Design Analytics

  7. Conclusion and References

A. Interview Questions


2.3 Assessing Creative Visual Design

As Gero and Maher discuss, in creative design, differences in the representation of ideas are the rule rather than the exception [33]. This makes assessing creative design challenging. A common pedagogical tool is the “design critique”, in which instructors, peers, and an invited jury provide students with feedback on their work [22, 57]. Feedback draws on a broad range of criteria across dimensions such as product, process, content knowledge, and communication [25]. Criteria for product visual design focus on characteristics such as color, form, composition, and layout [25]. However, instructor-led assessment cannot keep up with the growing demands of design education [47].

Approaches developed by creativity and crowdsourcing researchers align with visual design assessment in courses. Kerne et al. developed creativity metrics for assessing information-based ideation tasks and activities [44]. Their ‘Visual Presentation’ metric includes criteria such as whitespace, alignment, and the organization of ideas into lines, grids, or other shapes. Human raters applied these metrics to assess free-form creative assemblages of ideas. Xu et al. developed guidelines for crowd assessment of visual design work, including the criteria of proximity, alignment, repetition, and contrast [82]. For multiscale design, Lupfer et al. measured the number of scales used by counting how many times one must zoom in to make inner elements legible [52]. Despite the potential of these approaches to assist design courses, they face the same limitation as instructor assessment: human raters may not always be available.
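Lupfer et al.'s zoom-count measure can be illustrated with a short sketch. The function below estimates how many scales a design spans by counting zoom-in steps until the smallest element would render legibly; the parameter names, defaults (12 px legibility threshold, 2x zoom steps), and element representation are illustrative assumptions, not the published procedure.

```python
def count_scales(element_heights, viewport_scale=1.0,
                 legible_px=12.0, zoom_factor=2.0):
    """Estimate the number of scales in a multiscale design, in the
    spirit of Lupfer et al.'s zoom-count measure: how many times one
    must zoom in (by `zoom_factor`) before the smallest element
    renders at a legible size (`legible_px`).

    Parameter names and defaults here are assumptions for
    illustration, not the published measurement protocol.
    """
    steps = 0
    for h in element_heights:
        rendered = h * viewport_scale
        needed = 0
        while rendered < legible_px:
            rendered *= zoom_factor
            needed += 1
        steps = max(steps, needed)
    return steps + 1  # the initial view counts as the first scale

# A design whose smallest element is 1.5 canvas units tall needs
# three 2x zoom-ins before that element reaches 12 px.
print(count_scales([48.0, 12.0, 1.5]))  # → 4
```

A single-scale design (everything legible in the initial view) returns 1, matching the intuition that zooming is never required.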

Computational approaches have the advantage of processing data at speed and providing on-demand assessment [38]. Reinecke et al. assessed website aesthetics by developing a regression model based on attributes such as color, symmetry, and the number of images and text groups [64]. Oulasvirta et al.’s Aalto Interface Metrics web service provides assessments of graphical user interface designs, helping designers identify and address shortcomings [59]. For multiscale design, Jain et al. developed a computational model based on spatial clustering, which identifies the scales and clusters present in design work [40]. However, prior computational approaches did not investigate experiences in design education course contexts.
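The spatial-clustering idea behind Jain et al.'s model can be sketched minimally: elements whose centers fall within a distance threshold are grouped into the same cluster. The single-link, union-find grouping and the `eps` threshold below are stand-in assumptions; the published model's features and parameters are not reproduced here.

```python
def cluster_elements(centers, eps=50.0):
    """Group design elements into spatial clusters: two elements join
    the same cluster when their centers lie within `eps` canvas units
    of each other (single-link grouping via union-find). A minimal
    stand-in for spatial clustering as in Jain et al.'s model, not
    the published implementation.
    """
    n = len(centers)
    parent = list(range(n))

    def find(i):
        # Path-halving find for the union-find structure.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = centers[i], centers[j]
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= eps ** 2:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

# Two tight groups far apart on the canvas yield two clusters.
groups = cluster_elements([(0, 0), (10, 5), (400, 400), (395, 410)])
print(len(groups))  # → 2
```

The cluster count itself is then a candidate analytic: it summarizes how a student has spatially organized their work.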

Our investigation presents data about instructor experiences with AI-based analytics by invoking Jain et al.’s model [40] to compute multiscale design analytics. We consider prior work on the derivation and presentation of learning analytics to inform the approach we take for presenting multiscale design analytics in our research artifact.

2.4 Learning Analytics and Dashboards

Learning analytics and dashboard technologies have been found to support instructors in identifying student problems and intervening, which improves student retention and success [6]. In lecture-based course contexts, analytics (e.g., the number of times a student accessed a resource, time spent, and the length of textual annotations) have assisted instructors in assessing student understanding [24]. Likewise, dashboards have proven effective in lecture-based contexts, providing a quick understanding of student progress through representations such as tables and graphs [27, 78].

Design course contexts involve project-based learning [28]. As Blikstein discusses, project-based contexts call for measuring more open-ended and complex characteristics, which can give instructors insight into students’ creative processes [13]. In design course contexts specifically, Britain et al.’s study surfaced this need when they presented Fluency analytics (the number of elements, words, and images) to design instructors [16]. While the instructors found Fluency useful for gaining insight into students’ effort across various dimensions, they desired more sophisticated analytics. In our investigation, we present AI-based analytics to instructors (the number of scales and clusters), giving them insight into students’ multiscale design organization.
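Fluency-style counts are straightforward to compute, which is part of their appeal and their limitation. The sketch below tallies elements, words, and images over a student design; the element schema (dicts with a `type` field and optional `text`) is an assumption for illustration, not the schema Britain et al. used.

```python
def fluency_analytics(elements):
    """Compute Fluency-style counts (number of elements, words, and
    images) over a student design, in the spirit of the analytics
    Britain et al. presented to instructors. The element schema is
    an illustrative assumption.
    """
    n_elements = len(elements)
    n_images = sum(1 for e in elements if e.get("type") == "image")
    n_words = sum(len(e.get("text", "").split()) for e in elements)
    return {"elements": n_elements, "words": n_words, "images": n_images}

design = [
    {"type": "text", "text": "Mood board for kiosk redesign"},
    {"type": "image"},
    {"type": "text", "text": "palette: warm neutrals"},
]
print(fluency_analytics(design))  # → {'elements': 3, 'words': 8, 'images': 1}
```

Such counts measure effort but say nothing about organization, which is precisely the gap the scale and cluster analytics are meant to address.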

Prior analytics dashboards focus on presenting facts. With advances in AI and its applicability across diverse domains, the community has begun researching AI-based analytics, which include inferences [36, 48]. Presenting AI results in a comprehensible manner is vital so that users can trust the system and rely on its assistance [67]. Among prior work, Oulasvirta et al. provide visualizations of assessed visual design characteristics, e.g., clutter, colorfulness, and white space, within website designs [59]. However, they did not assess or present multiscale design characteristics. The present research focuses on conveying the meaning of scales and clusters analytics computed with AI recognition. To this end, we integrate the dashboard presentation of analytics with the actual design work it measures.


:::info Authors:

(1) Ajit Jain, Texas A&M University, USA; Current affiliation: Audigent;

(2) Andruid Kerne, Texas A&M University, USA; Current affiliation: University of Illinois Chicago;

(3) Nic Lupfer, Texas A&M University, USA; Current affiliation: Mapware;

(4) Gabriel Britain, Texas A&M University, USA; Current affiliation: Microsoft;

(5) Aaron Perrine, Texas A&M University, USA;

(6) Yoonsuck Choe, Texas A&M University, USA;

(7) John Keyser, Texas A&M University, USA;

(8) Ruihong Huang, Texas A&M University, USA;

(9) Jinsil Seo, Texas A&M University, USA;

(10) Annie Sungkajun, Illinois State University, USA;

(11) Robert Lightfoot, Texas A&M University, USA;

(12) Timothy McGuire, Texas A&M University, USA.

:::


:::info This paper is available on arxiv under CC by 4.0 Deed (Attribution 4.0 International) license.

:::
