7th ANNUAL COMPTEXT Conference 2025

Computational Method Conference

By Isaac Bravo in conference

April 24, 2025

Abstract

Here is the presentation I gave with Sean Palicki (main author), titled "Don't Look Up: Evaluating the Tradeoff between Accuracy and Sustainability of LLMs for Text Analysis," at the 7th Annual COMPTEXT Conference in Vienna.

Date

April 24 – 26, 2025

Time

12:00 AM

Location

Vienna, Austria

Abstract:

Large language models (LLMs) are widely used as research tools, but their high resource demands raise significant environmental concerns. While LLMs offer advantages in certain applications, their high energy demands prompt a necessary question for social scientists: Is it worth considering LLMs for every text analysis task? This study systematically evaluates the trade-off between performance and energy usage across computational text analysis methods, including dictionaries, trained classifiers, and open "local" LLMs. Applying sentiment analysis, multi-class classification, and named entity recognition to political documents, we measure energy consumption, CO2 emissions, correlation with human raters, F1-Score, and processing time. We find that LLMs perform well on sentiment analysis, closely matching human judgment, but at relatively high environmental costs. For classification and named entity recognition, task-specific models achieve superior accuracy and low environmental impact. Contrary to multi-purpose LLM benchmarks, larger parameter counts do not guarantee better performance on text classification tasks. Introducing a CO2-Adjusted F1-Score, we observe that smaller and more efficient models, such as Mistral-Nemo (12B), outperform larger quantized models like Deepseek-R1 (32B). Our findings highlight the necessity for thoughtful model selection, rather than defaulting to LLMs. A "right-fit" approach, employing task-specific, lighter methods, offers performance and sustainability benefits.
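The paper's exact definition of the CO2-Adjusted F1-Score is not reproduced in this abstract. As a purely illustrative sketch, one plausible form of such a metric discounts a model's F1 by the fraction of an emissions budget it consumes; the function names, the budget parameter, and the adjustment formula below are assumptions for illustration, not the authors' actual method:

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """Standard F1 from true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)


def co2_adjusted_f1(f1: float, emissions_kg: float, budget_kg: float) -> float:
    """Hypothetical adjustment: scale F1 by the unused share of a CO2 budget,
    so two models with equal F1 rank by how little they emit."""
    return f1 * (1 - emissions_kg / budget_kg)


# Toy comparison under an assumed 1 kg CO2 budget: a smaller model with
# slightly lower raw F1 but far lower emissions can come out ahead.
small = co2_adjusted_f1(f1_score(78, 12, 10), emissions_kg=0.05, budget_kg=1.0)
large = co2_adjusted_f1(f1_score(80, 10, 10), emissions_kg=0.60, budget_kg=1.0)
print(small > large)
```

Any such adjustment embeds a normalization choice (here, a fixed budget); the paper's actual metric may weight emissions differently.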

The Presentation:
