By Mamadou-Abou Sarr
Welcome to ArteQuants
I’ve always lived at the intersection of two seemingly distant worlds: quantitative finance and the arts. Yet for me, the elegance of a mathematical optimization and the emotional resonance of a piece of art evoke the same spark: intellectual, precise, and poetic.
ArteQuants was born from this dual passion: a space where rigorous analytics meet cultural assets. Here, I explore how quantitative thinking can illuminate the value, structure, and dynamics of art, wine, collectibles, and beyond. Whether you’re a data enthusiast, a collector, or simply curious about the hidden patterns in culture, ArteQuants invites you to join me on a journey of research, reflection, and discovery.
Issue No. 1 - Seeing in Code
Andreas Müller-Pohle’s Digital Scores and Entropia
I wanted my first ArteQuants essay to begin exactly where my two passions meet, in that strange, beautiful space where art becomes data, and data becomes art. Few artists embody that bridge as powerfully as Andreas Müller-Pohle.
His series Digital Scores and Entropia have been part of our collection (The Sarr Collection) for years, and they continue to challenge me every time I look at them. These aren’t just photographs. They’re systems. Translations. Mathematical events. Images that have been broken apart, reorganized, coded, decoded, or reinvented.
What I love about Müller-Pohle is that he invites us to question what an image really is:
Is it a picture?
A pattern?
A string of numbers?
A probability distribution?
A message waiting to be interpreted?
Andreas Müller-Pohle, Digital Scores V (after Nicéphore Niépce), 2001
Quantitative Angles on Müller-Pohle’s Work
Information Theory — Digital Scores
- Byte Count: ~7 million bytes encode Niépce’s photograph.
- Shannon Entropy: 4.21 bits/symbol; structured, not random.
- Compression Ratio: ~1.45×; signal dominates noise.
- Finance Parallel: Like separating alpha from beta in portfolio optimization (a toy sketch follows below).
Thermodynamics — Entropia
- Entropy as Disorder: Visualizes decay and transformation.
- Irreversibility: Cultural assets degrade like thermodynamic systems.
- Finance Parallel: Mirrors systemic risk and volatility clustering.
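To make the first finance parallel concrete, here is a toy Python sketch using synthetic returns (not data from the artwork or the collection). It separates beta, the co-movement with a market factor, from alpha, the residual that remains once that co-movement is stripped out; the analogy is that entropy and compression perform the same kind of structure-versus-noise separation on an image’s byte stream.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily returns: a market factor plus an asset exposed to it (illustrative only)
market = rng.normal(0.0004, 0.010, 1000)
asset = 0.0002 + 1.3 * market + rng.normal(0.0, 0.004, 1000)

# Beta: the part of the asset's behaviour explained by the market (the "structure")
beta = np.cov(asset, market)[0, 1] / np.var(market, ddof=1)

# Alpha: the average return left over once the market's contribution is removed (the "signal")
alpha = asset.mean() - beta * market.mean()

print(f"beta  ~ {beta:.2f}")
print(f"alpha ~ {alpha:.5f} per day")
```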
Shannon Entropy Formula
To quantify the uncertainty embedded in Müller-Pohle’s Digital Scores, I use the classic formula from information theory:
Entropy Analysis of Digital Scores
$$ H(X) = - \sum_{i=1}^{k} p_i \log_{2}(p_i) $$
Where:
- H(X) is the entropy of the symbol stream
- k is the number of unique symbols
- p_i is the probability of symbol i
Results:
- Entropy per symbol: 4.21 bits
- Total entropy over 7 million bytes: H_total = 7,000,000 × 4.21 ≈ 29.47 million bits
- Compression estimate: ~1.45× (vs. a maximum entropy of log2(68) ≈ 6.09 bits/symbol)
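For readers who want to reproduce these numbers, here is a minimal Python sketch of the calculation on any byte stream; the file name is a placeholder, not the actual file behind Digital Scores.

```python
import math
from collections import Counter

def entropy_report(data: bytes) -> dict:
    """Shannon entropy of a symbol stream, plus max entropy and compression ratio."""
    counts = Counter(data)
    n = len(data)
    probs = [c / n for c in counts.values()]

    # H(X) = -sum p_i * log2(p_i), in bits per symbol
    h = -sum(p * math.log2(p) for p in probs)

    # Maximum entropy for the observed alphabet: log2(k), every symbol equally likely
    h_max = math.log2(len(counts))

    return {
        "unique_symbols": len(counts),     # alphabet size k (68 in the essay)
        "entropy_bits_per_symbol": h,      # 4.21 bits/symbol in the essay
        "total_entropy_bits": h * n,       # information content of the whole stream
        "max_entropy_bits": h_max,         # log2(68) ~ 6.09
        "compression_ratio": h_max / h,    # ~1.45x: structure, not noise
    }

# Hypothetical file name; any byte stream works.
with open("digital_scores_v.dat", "rb") as f:
    print(entropy_report(f.read()))
```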
Symbol Distribution
- Alphabet size: 68 unique symbols
- Top 20 symbols: ~72% of all characters
- Skew: Strong bias toward whitespace, punctuation, and common letters
Structural Dependencies
- Lag-1 autocorrelation: ~0.11
- Mutual information (lag 1): ~0.07 bits
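The dependency metrics can be estimated in the same spirit. The sketch below (again treating the work as a raw byte stream, which is an assumption about the file format) computes the lag-1 autocorrelation of byte values and the lag-1 mutual information between adjacent symbols; values near zero would mean a memoryless stream, so the ~0.11 and ~0.07 bits above point to mild but real local structure.

```python
import math
from collections import Counter

def lag1_dependencies(data: bytes) -> dict:
    """Estimate lag-1 autocorrelation and lag-1 mutual information of a symbol stream."""
    n = len(data)
    x, y = data[:-1], data[1:]   # adjacent pairs (s_t, s_{t+1})
    m = n - 1

    # Lag-1 autocorrelation on raw byte values
    mean = sum(data) / n
    var = sum((b - mean) ** 2 for b in data) / n
    cov = sum((a - mean) * (b - mean) for a, b in zip(x, y)) / m
    autocorr = cov / var

    # Lag-1 mutual information: I(S_t; S_{t+1}) = sum p(a,b) * log2( p(a,b) / (p(a) * p(b)) )
    pxy = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = sum((c / m) * math.log2((c / m) / ((px[a] / m) * (py[b] / m)))
             for (a, b), c in pxy.items())

    return {"lag1_autocorrelation": autocorr, "lag1_mutual_information_bits": mi}

# Hypothetical file name; any byte stream works.
with open("digital_scores_v.dat", "rb") as f:
    print(lag1_dependencies(f.read()))
```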
| Metric | Value | Insight |
|---|---|---|
| Shannon Entropy | 4.21 bits/symbol | Structured, compressible stream |
| Total Entropy | ~29.47 million bits | High information density |
| Unique Symbols | 68 | Rich but skewed alphabet |
| Top 20 Symbol Share | 72% | Strong encoding bias |
| Lag-1 Autocorrelation | 0.11 | Local structure present |
| Mutual Information (lag 1) | 0.07 bits | Predictable adjacency |
ArteQuants Insight
The 7 million bytes in Digital Scores are not just technical; they’re symbolic. Müller-Pohle transforms Niépce’s analog legacy into a digital score, and entropy quantifies that transformation. Just as asset prices encode market behavior, this image encodes visual history. ArteQuants treats cultural assets as data-rich systems, revealing the hidden logic behind beauty.
Let’s keep decoding culture, one byte at a time.