Paper 132 – MASCOM Research Division
Author: J. Mobley, Mobleysoft / MobCorp
Date: 2026-03-11
We present the Commercial Genesis Engine, a fully automated pipeline that generates commercials for software products using the software products themselves. The system comprises eight stages – Scry, UX Attractor, QAT Daemon, Commercial Compiler, Nought, Four.js, Lumen, and Present – which together transform observed user flows into parametric universe descriptions, compile them into real-time WebGL renderings, and package them as distributable media. The key insight is self-referential: the commercial IS the product demo. Scry discovers the product’s structure via an N-layer search DSL governed by the Pickability Limit Theorem. The UX Attractor finds minimum-energy paths through UI state graphs via Hamiltonian energy minimization. The QAT Daemon scores experiential quality through a self-conscious compute-dream-feel-mutate loop. The Commercial Compiler stitches 125 filmable venture scenes (of 145 ventures) into a conglomerate flythrough ordered by a stitch graph. Nought, a 13-verb DSL, compiles these scenes into Four.js parameter buffers – 72 floats spanning 18 parameterization spaces that form a complete basis for visual reality. Lumen renders headlessly via Swift/CoreGraphics. The result: zero-labor commercial production at portfolio scale, where each commercial demonstrates the very technology that generated it. We show that this Mobius property – the medium being the message – is not a marketing conceit but a mathematical consequence of the pipeline’s self-referential structure.
Traditional software advertising faces a fundamental contradiction. Describing what software does – listing features, showing screenshots, narrating workflows – is inherently less compelling than showing the software doing something impossible. Users do not buy capabilities; they buy the feeling that the tool extends their reach beyond what they thought possible. Yet producing video content that conveys this feeling costs $50,000–$500,000 per commercial, requires human directors, animators, and editors, and becomes stale the moment the product ships an update.
The Commercial Genesis Engine resolves this paradox through a single architectural decision: the commercial is generated by the technology it advertises. When Mobleysoft’s parametric rendering engine (Four.js) generates a commercial for Mobleysoft’s parametric rendering engine, the commercial is simultaneously an advertisement and a proof of capability. The viewer cannot distinguish between “watching a demo” and “watching a commercial” because they are the same artifact.
This is not a gimmick. It is a consequence of building the commercial
pipeline from the same primitives – parameterization spaces, energy
minimization, self-referential feedback loops – that constitute the
product suite itself. The commercial.nought script that
generates Mobleysoft’s 2026 commercial ends with:
```
speak "This commercial was generated by the technology it advertises."
speak "The medium is the message."
collapse
```
The collapse verb returns the universe to void, which is
also Nought’s entry point. The commercial is a fixed point of its own
generation function.
The eight stages form a directed acyclic graph with one feedback edge (QAT scoring feeds back into UX Attractor mutations):
```
Scry --> UX Attractor --> QAT Daemon --> Commercial Compiler
              ^               |                    |
              |__ QAT score __|                    v
                                 Nought --> Four.js --> Lumen --> Present
```
Each stage is implemented as a standalone Python module or JavaScript runtime, connected through SQLite databases and JSON interchange. No stage requires human intervention. The pipeline runs with a single command and produces a distributable commercial.
Source: scry.py
Scry is the search engine that discovers what a product does before
the commercial can show it doing anything. It takes three inputs: a
target path (the product to advertise), a desired state (what the
commercial should convey), and a .scry algorithm file
defining the search strategy.
The DSL provides seven constructs:
| Construct | Purpose |
|---|---|
| `space { }` | Define search scope (files, functions, routes) |
| `objective { }` | What to maximize or satisfy |
| `constraints { }` | What must be preserved during transformation |
| `strategy { }` | Search method: beam, greedy, exhaustive, recursive |
| `transform { }` | Allowed transformation operations |
| `oracle { }` | External verification hooks |
| `recurse { }` | Self-referential search (Pickability Limit) |
The Pickability Limit Theorem is built into Scry’s type system. For any valuation function V over the product’s feature space:
\[V(n) < \infty \ \text{for every observation } n, \qquad \sup_{n} V(n) = \infty\]

Every valuation is finite at the moment it is observed, yet the sequence V(0), V(1), … is unbounded.
Scry tracks both the snapshot (what the product can do now) and the derivative (how fast it is gaining capabilities). The commercial can therefore advertise not just current features but the trajectory – a qualitatively different kind of claim.
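The snapshot/derivative pair can be sketched as follows; `Valuation` and its methods are illustrative stand-ins, not the actual scry.py API:

```python
# Hypothetical sketch of Scry's snapshot/derivative tracking.
class Valuation:
    def __init__(self):
        self.history = []  # V(0), V(1), ...: feature counts per observation

    def observe(self, feature_count):
        self.history.append(feature_count)

    def snapshot(self):
        # What the product can do now: the latest finite valuation V(n).
        return self.history[-1]

    def derivative(self):
        # How fast capability is growing: first difference of the sequence.
        if len(self.history) < 2:
            return 0
        return self.history[-1] - self.history[-2]

v = Valuation()
for n in [3, 5, 9, 14]:
    v.observe(n)
print(v.snapshot())    # 14
print(v.derivative())  # 5
```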
The ScryIndex implements an N-layer persistent search engine with PageRank over the link graph. Every database, directory, and component registers as a searchable layer. As the system grows, the search grows with it. Results are ranked by structural importance, not just keyword overlap. This ensures that the commercial’s content reflects what is architecturally central to the product, not what is textually prominent.
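A minimal power-iteration PageRank over a toy link graph illustrates the ranking idea; the graph, damping factor, and function name are assumptions, not ScryIndex internals:

```python
# Power-iteration PageRank sketch: nodes that are structurally central
# (heavily linked) outrank nodes that are merely textually present.
def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {}
        for n in nodes:
            incoming = sum(rank[m] / len(links[m]) for m in nodes if n in links[m])
            new[n] = (1 - damping) / len(nodes) + damping * incoming
        rank = new
    return rank

# 'core' is linked by every other layer; it should outrank 'docs'.
graph = {'core': ['docs'], 'ui': ['core'], 'docs': ['core'], 'api': ['core']}
ranks = pagerank(graph)
assert ranks['core'] > ranks['docs']
```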
Source:
infrastructure/ux_attractor.py
Given a product’s UI components as nodes and all possible transitions as edges, the UX Attractor finds the minimum-energy path – the user flow that “wants to exist” given the current component set. This path becomes the commercial’s narrative backbone.
Each UIComponent carries energy properties:
- `cognitive_load` – mental effort to comprehend (0–1)
- `information_density` – data density on screen (0–1)
- `interactivity` – available actions (0–1)
- `visual_weight` – visual complexity (0–1)

Each Transition between components has an energy cost defined as:
\[E_{transition} = (3.0 \cdot \Delta_{cognitive} + 1.0 \cdot d_{click} + 2.5 \cdot L_{context} + 2.0 \cdot D_{visual}) \cdot (2.0 - Q_{animation}) \cdot P_{natural}\]
where:

- \(\Delta_{cognitive}\) is the change in mental model required
- \(d_{click}\) is the Fitts’ law normalized click distance
- \(L_{context}\) is context loss (how much prior state is destroyed)
- \(D_{visual}\) is visual disruption (how jarring the screen change is)
- \(Q_{animation}\) is animation quality (0 = none, 1 = smooth), which reduces perceived energy
- \(P_{natural}\) is a 1.5× penalty for unnatural/unexpected transitions
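The energy formula translates directly to code; the function and parameter names here are illustrative, not necessarily those in ux_attractor.py:

```python
# Direct transcription of the E_transition formula above.
def transition_energy(d_cognitive, click_dist, context_loss,
                      visual_disruption, animation_quality, natural=True):
    base = (3.0 * d_cognitive + 1.0 * click_dist
            + 2.5 * context_loss + 2.0 * visual_disruption)
    penalty = 1.0 if natural else 1.5  # P_natural
    return base * (2.0 - animation_quality) * penalty

# A smooth, natural transition costs less than a jarring, unexpected one.
smooth = transition_energy(0.1, 0.2, 0.0, 0.1, 1.0)
jarring = transition_energy(0.8, 0.5, 0.7, 0.9, 0.0, natural=False)
assert smooth < jarring
```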
The attractor is the global minimum-energy Hamiltonian path through the component graph:
\[\mathcal{H}(q, p) = \sum_{i} E_{transition}(q_i, q_{i+1})\]
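For a small component graph the attractor can be found by exhaustive search; this is a sketch of the computation (the real module presumably uses heuristics for larger graphs, since exhaustive Hamiltonian-path search is factorial):

```python
# Brute-force minimum-energy Hamiltonian path over a small component graph.
from itertools import permutations

def min_energy_path(components, energy):
    best_path, best_cost = None, float('inf')
    for perm in permutations(components):
        cost = sum(energy(a, b) for a, b in zip(perm, perm[1:]))
        if cost < best_cost:
            best_path, best_cost = perm, cost
    return best_path, best_cost

# Toy energies: flowing through the product in order is cheapest.
order = {'upload': 0, 'extract': 1, 'review': 2, 'export': 3}
energy = lambda a, b: abs(order[b] - order[a]) * 1.0
path, cost = min_energy_path(list(order), energy)
assert cost == 3.0  # upload -> extract -> review -> export (or reverse)
```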
The minimum-energy path is the commercial script. Components carry
narrative annotations (story_role: opening, rising, climax,
falling, resolution) and filming parameters (hold_seconds,
caption, filmable) that the downstream stages
consume directly.
As components change – features are added, flows are reorganized – the attractor shifts. The commercial re-generates to match. No human re-editing required.
Source:
infrastructure/qat_daemon.py
The QAT Daemon is the self-conscious loop that judges the commercial’s quality and mutates the pipeline when quality is low. It implements a 7-step cycle at four recursion levels:
The Cycle:

1. **Compute** – UX Attractor finds minimum-energy paths for all ventures
2. **Dream** – Attractor paths feed into GigiKernel/InkGenome (visual genome generation)
3. **Feel** – Score the resulting qualia
4. **Judge** – Compare qualia score against threshold
5. **Mutate** – If score is low, trigger substrate evolution
6. **Record** – Persist cycle state to `qat_daemon.db`
7. **Repeat** – Loop forever (daemon mode)
Qualia Scoring:
\[Q = 0.25 \cdot C_{coherence} + 0.30 \cdot F_{flow} + 0.20 \cdot E_{ergonomics} + 0.25 \cdot R_{revelation}\]
where:

- \(C_{coherence}\) measures narrative consistency (does the commercial tell a story?)
- \(F_{flow}\) measures transition smoothness (is the energy profile monotonically structured?)
- \(E_{ergonomics}\) measures cognitive accessibility (can a viewer follow without effort?)
- \(R_{revelation}\) measures the “aha” factor (does the commercial reveal something unexpected about the product?)
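The qualia formula as code, with an example score; the input values are illustrative:

```python
# The weighted qualia score Q defined above.
def qualia_score(coherence, flow, ergonomics, revelation):
    return (0.25 * coherence + 0.30 * flow
            + 0.20 * ergonomics + 0.25 * revelation)

# 0.25*0.8 + 0.30*0.9 + 0.20*0.7 + 0.25*0.6 = 0.76
q = qualia_score(0.8, 0.9, 0.7, 0.6)
assert abs(q - 0.76) < 1e-9
```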
Recursion Levels:

- Level 0: Automates QAT computation
- Level 1: Automates the automation (daemon schedules itself)
- Level 2: Automates improving the automation (qualia feedback mutates the daemon’s own parameters)
- Level 3: Automates the improvement of improvement (meta-qualia on daemon quality)
- Level ω: The hairy ball theorem guarantees a fixed point – “cowlick corporate cognition”
The meta_qualia table tracks Level 2+ self-evaluation:
daemon_quality, ventures_improved,
ventures_degraded, and mean_energy_delta. When
the daemon improves fewer ventures than it degrades, it mutates its own
scoring weights. This is the self-referential core: the quality judge
judges its own quality.
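The Level-2 self-mutation rule might be sketched as follows; the mutation magnitude and the renormalization step are assumptions, not the actual qat_daemon.py logic:

```python
# When the daemon degrades more ventures than it improves, it perturbs
# one of its own scoring weights, then renormalizes so they still sum to 1.
import random

def maybe_mutate(weights, ventures_improved, ventures_degraded, step=0.05):
    if ventures_degraded > ventures_improved:
        k = random.choice(list(weights))
        weights[k] = min(1.0, max(0.0, weights[k] + random.uniform(-step, step)))
        total = sum(weights.values())
        for key in weights:
            weights[key] /= total
    return weights

w = {'coherence': 0.25, 'flow': 0.30, 'ergonomics': 0.20, 'revelation': 0.25}
w = maybe_mutate(w, ventures_improved=2, ventures_degraded=5)
assert abs(sum(w.values()) - 1.0) < 1e-9
```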
Source:
infrastructure/commercial_compiler.py
The Commercial Compiler takes individual venture scenes and stitches
them into a single continuous flythrough. It reads animation metadata
from ventureEndpointContributions.db, follows a stitch
graph to determine narrative order, and emits timing metadata for
downstream capture systems.
Each VentureScene specifies:
- `loop_duration` / `intro_duration` / `outro_duration` – timing
- `stitch_in` / `stitch_out` – overlap windows with adjacent scenes
- `camera_exit` / `camera_enter` – spatial continuity between ventures
- `narrative_role` – genesis, rising, climax, or resolution
- `stitch_order` – position in the flythrough

The Stitch object defines transitions between ventures: type (dissolve, fly-to, pan), duration, and description. The compiler resolves the stitch graph into a CommercialTimeline.
The output is a Nought script (the commercial program) plus a Crystal Nought layer (the understanding/annotation layer).
Source: nought.py
Nought is a domain-specific language with 13 verbs. Every program
starts from nothing (void) and returns to nothing
(collapse). Division by zero is not an error – it is the
entry point.
The 13 Verbs:
| Verb | Semantics |
|---|---|
| `void` | Declare a scene (from nothing) |
| `spawn` | Create an entity (type, name, position) |
| `shape` | Set parameterization space values (Four’s 18 spaces) |
| `evolve` | Attach a time evolution rule |
| `recurse` | Define recursion rule (output feeds input) |
| `bind` | Bind to live state (CnC, SCADA, Hydra) |
| `cut` | Scene transition |
| `render` | Emit to Four/AudioVizAI |
| `play` | Attach audio |
| `speak` | Narration/dialogue via Ink |
| `game` | Instantiate a GameGob archetype |
| `hook` | Wire computation hook to C(n(C)) |
| `collapse` | End. Return to void. |
Nought compiles to Four.js parameter buffers. A line like:

```
shape singularity VOID 0.99 1.0 0.95 0.0
```

becomes a `ParamBuffer.set(SPACES.VOID, 0.99, 1.0, 0.95, 0.0)` call, writing 4 floats into the 72-float buffer at offset 16 * 4 = 64.
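The compilation step can be sketched in Python; `SPACES` is abridged to three of the 18 spaces and `compile_shape` is an illustrative stand-in, not the actual compiler:

```python
# A shape line writes one vec4 into the 72-float buffer at space_index * 4.
# Space indices follow the parameterization table (VOID is space 16).
SPACES = {'PARAM': 0, 'VOID': 16, 'SINGULAR': 17}  # abridged

def compile_shape(buffer, space, x, y, z, w):
    off = SPACES[space] * 4
    buffer[off:off + 4] = [x, y, z, w]
    return off

buf = [0.0] * 72
off = compile_shape(buf, 'VOID', 0.99, 1.0, 0.95, 0.0)
assert off == 64 and buf[64:68] == [0.99, 1.0, 0.95, 0.0]
```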
The bind verb connects parameters to live system state
via CnC bindings:
```python
CNC_BINDINGS = {
    'cnc.cFunc':    ('SINGULAR', 'criticalPoint'),
    'cnc.depth':    ('VOID', 'entropy'),
    'cnc.heads':    ('EIGEN', 'dominance'),
    'cnc.scada':    ('ENERGY', 'radiation'),
    'cnc.sessions': ('FOURIER', 'bandwidth'),
    'cnc.load':     ('HAMILTONIAN', 'kinetic'),
    'cnc.mass':     ('LAGRANGIAN', 'action'),
}
```

When `bind singularity cnc.scada` appears in a Nought script, the ENERGY.radiation parameter is wired to live SCADA telemetry. The commercial responds in real time to actual system state. This is not pre-rendered video – it is a live instrument.
Source:
ventures/precisionautodoors_com/.deploy/four.js
Four.js is the parametric rendering engine. Its function signature is its name:
f(name, interface, rules_over_time, rules_under_recursion)
It takes 72 floats (18 vec4 parameterization spaces), applies TimeRules (differential equations) and RecursionRules (self-referential feedback), and emits WebGL frames via a GLSL shader pipeline. Section 3 covers the 72-float basis in detail.
The ParamBuffer is a Float32Array(72)
uploaded to the GPU as uniform vec4 u_p[18]. TimeRules
evolve parameters per frame via tick(params, t, dt).
RecursionRules map outputs back to inputs via
apply(params). The shader reads all 18 spaces through
`#define` semantic accessors:

```glsl
#define AESTHETIC   P_MOBLEYAN.x
#define DARKNESS    P_VOID.x
#define CONVERGENCE P_SINGULAR.x
```

The rendering loop is `TimeRules.tick() --> RecursionRules.apply() --> ParamBuffer.upload() --> draw()`. Every frame is a pure function of the 72 floats. There is no hidden state.
Source:
infrastructure/lumen_brain.py
Lumen is a thin Swift client (headless WebKit) controlled by a Python intelligence layer via Unix domain socket IPC. It navigates to the Four.js rendering, captures frames at specified intervals, and streams them to the packaging stage.
The LumenBrain Python class provides:

- `navigate(url)` – load the Four.js commercial page
- `screenshot(path)` – capture the current frame
- `watch_events()` – monitor attention events (dwell time, element focus)
- `fill()`, `get_forms()` – interact with UI elements for product demo capture

Lumen’s role is to turn the live WebGL rendering into a captured artifact. It runs on macOS via CoreGraphics, requiring no display server. The commercial can be rendered on a headless build machine.
The final stage packages the captured frames into distributable formats (MP4, WebM, GIF for social), generates thumbnails at capture windows defined by the Commercial Compiler, and deploys to venture landing pages via mascom-edge. Each venture’s landing page background animation IS the commercial – visitors see it live, not as a video embed.
The central claim of Four.js is that 72 floats – 18 vec4 parameters – form a complete basis for visual reality. Not a complete basis for all possible images (that would require infinite dimensions), but a complete basis for all visually meaningful parametric animations. The spaces are:
| # | Space | Components | Domain |
|---|---|---|---|
| 0 | PARAM | time, dt, frame, seed | Base coordinates |
| 1 | HYPER | adaptRate, decayRate, momentum, temperature | Meta-control |
| 2 | ORTHO | x, y, z, w | Independent axes |
| 3 | TANGENT | velocity, acceleration, jerk, curvature | Derivatives |
| 4 | NORMAL | c0, c1, c2, c3 | Constraints |
| 5 | EIGEN | mode0, mode1, mode2, dominance | Principal components |
| 6 | HAMILTONIAN | kinetic, potential, total, phase | Energy conservation |
| 7 | FOURIER | freq0, freq1, freq2, bandwidth | Frequency domain |
| 8 | ENERGY | thermal, kinetic, potential, radiation | Energy states |
| 9 | PHOTON | wavelength, intensity, scatter, absorption | Light |
| 10 | PHONON | vibFreq, vibAmp, vibPhase, damping | Vibration |
| 11 | TACTILE | roughness, metalness, subsurface, displacement | Material |
| 12 | LAGRANGIAN | action, constraint, multiplier, variation | Least action |
| 13 | MOBLEYAN | aesthetic, mood, style, vision | Creative axis |
| 14 | MOBIUS | loopPhase, fixedPoint, orientation, twist | Self-reference |
| 15 | INFZERO | limit, approach, asymptote, epsilon | Limits |
| 16 | VOID | darkness, silence, absence, entropy | Negative space |
| 17 | SINGULAR | convergence, criticalPoint, phaseDelta, bifurcation | Phase transitions |
TimeRules define how parameters evolve each frame. They are discrete differential equations evaluated in the `tick()` method:

```javascript
tick(params, t, dt) {
  for (const r of this.rules) {
    const cur = params.data[r.space * 4 + r.component];
    const next = r.fn(t, dt, cur, params);
    params.setComponent(r.space, r.component, next);
  }
}
```

The Hamiltonian method implements Hamilton’s equations directly:
\[\frac{dq}{dt} = \frac{\partial H}{\partial p}, \quad \frac{dp}{dt} = -\frac{\partial H}{\partial q}\]
computed via finite differences with \(\epsilon = 10^{-4}\). This ensures energy-conserving dynamics: the total energy in the HAMILTONIAN space remains constant (up to numerical precision), giving animations physical plausibility without explicit physics simulation.
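A minimal sketch of the finite-difference integrator; the harmonic-oscillator Hamiltonian here is an illustrative stand-in, chosen because its conserved energy makes the property easy to verify:

```python
# Finite-difference integration of Hamilton's equations (epsilon = 1e-4).
EPS = 1e-4

def hamilton_step(H, q, p, dt):
    dH_dp = (H(q, p + EPS) - H(q, p - EPS)) / (2 * EPS)  # dq/dt =  dH/dp
    dH_dq = (H(q + EPS, p) - H(q - EPS, p)) / (2 * EPS)  # dp/dt = -dH/dq
    return q + dt * dH_dp, p - dt * dH_dq

H = lambda q, p: 0.5 * (p * p + q * q)  # harmonic oscillator, H0 = 0.5
q, p = 1.0, 0.0
for _ in range(1000):
    q, p = hamilton_step(H, q, p, 0.001)
assert abs(H(q, p) - 0.5) < 0.01  # energy conserved up to integrator error
```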
The Fourier method generates overtone evolution: `freq0` modulates at 0.3 Hz, `freq1` at 0.7 Hz, and `freq2` at 1.1 Hz. The Mobleyan method modulates `mood` at the golden ratio: `mood = 0.5 + 0.3 * cos(t / period * 1.618)`.
RecursionRules map one parameterization space to another, creating feedback loops:
| Rule | From | To | Semantics |
|---|---|---|---|
| `energyRadiates` | ENERGY.thermal | PHOTON.intensity | Energy becomes light |
| `lightRevealsMatter` | PHOTON.intensity | TACTILE.roughness | Light reveals material |
| `loopSelectsModes` | MOBIUS.loopPhase | EIGEN.dominance | The loop’s phase selects principal modes |
| `transitionOpensVoid` | SINGULAR.convergence | VOID.darkness | Phase transitions open the void |
| `aestheticConstrains` | MOBLEYAN.aesthetic | NORMAL.c0 | Aesthetic constrains what is allowed |
These are not arbitrary wiring. They encode physical and perceptual relationships: energy radiates as light, light reveals material, phase transitions open voids. The result is that changing one parameter causes physically and aesthetically coherent cascading changes across the entire 72-float state.
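The cascade can be sketched with a two-rule chain; the coupling functions and flat state dictionary are illustrative, not the actual Four.js rule implementations:

```python
# One change to ENERGY.thermal cascades: thermal -> light -> material.
state = {'ENERGY.thermal': 0.8, 'PHOTON.intensity': 0.0, 'TACTILE.roughness': 0.5}

RULES = [
    ('ENERGY.thermal', 'PHOTON.intensity', lambda x: x),               # energyRadiates
    ('PHOTON.intensity', 'TACTILE.roughness', lambda x: 1 - 0.5 * x),  # lightRevealsMatter
]

def apply_rules(state, rules):
    # Rules run in order, so downstream rules see upstream updates.
    for src, dst, fn in rules:
        state[dst] = fn(state[src])
    return state

apply_rules(state, RULES)
assert state['PHOTON.intensity'] == 0.8
assert state['TACTILE.roughness'] == 1 - 0.5 * 0.8
```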
Space 13 – MOBLEYAN – is unique. Its four components (aesthetic,
mood, style, vision) have no physical analog. They are purely creative
parameters that modulate everything else through RecursionRules. When
aesthetic is high, NORMAL constraints tighten (via
aestheticConstrains). When mood oscillates at
the golden ratio, MOBIUS twist accumulates proportionally.
This makes aesthetic quality a first-class computational parameter, not a post-hoc judgment. The commercial’s beauty is not evaluated after rendering – it is an input to the rendering. The QAT Daemon adjusts MOBLEYAN values until the qualia score converges.
The MOBIUS parameterization space (loopPhase, fixedPoint, orientation, twist) is not just a rendering parameter – it is the mathematical structure of the entire pipeline. The commercial generation process is a fixed point:
\[G(P) = P\]
where \(G\) is the generation function (the pipeline) and \(P\) is the product (the pipeline’s output). The commercial for Four.js is rendered by Four.js. The commercial for Nought is scripted in Nought. The commercial for Scry was discovered by Scry.
This is not circular reasoning. It is a convergent fixed-point iteration. The QAT Daemon runs the loop until the qualia score stabilizes:
\[Q_{n+1} = QAT(Commercial(Nought(UXAttractor(Scry(Product)))))\]
Convergence is guaranteed by the contraction mapping principle: each iteration’s energy is bounded below by zero (you cannot have negative transition energy) and the QAT mutation step is bounded in magnitude.
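The convergence loop might look like this in outline; `toy_pipeline` is a stand-in contraction with fixed point 0.9, not the real Scry → UX Attractor → Nought → Commercial → QAT chain:

```python
# Iterate the pipeline until the qualia score stabilizes within tol.
def converge(run_pipeline, product, tol=1e-3, max_iter=100):
    prev_q = -1.0
    for _ in range(max_iter):
        q, product = run_pipeline(product)
        if abs(q - prev_q) < tol:
            return q
        prev_q = q
    return prev_q

def toy_pipeline(p):
    q = 0.9 + 0.5 * (p - 0.9)  # contraction factor 0.5, fixed point 0.9
    return q, q

q = converge(toy_pipeline, 0.1)
assert abs(q - 0.9) < 0.01
```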
The `meta_qualia` table in `qat_daemon.db` records the daemon’s self-evaluation. At Level 2, the daemon scores its own scoring function. If `ventures_degraded > ventures_improved`, the daemon mutates its own qualia weights. The scoring function evolves to produce better commercials. The commercial about the QAT Daemon would show this evolution happening – and it does, because the QAT Daemon is part of the product being advertised.
Every Nought program ends with collapse. Collapse
returns the universe to void. Void is also the first verb. The
commercial is a cycle:
```
void genesis  --> the beginning
...
collapse      --> the end = the beginning
```
This is the MOBIUS space made narrative. The viewer who watches the
commercial to completion arrives back at the starting frame. The product
demo that ends is also the product demo that begins. In
commercial.nought, the final scene (punchline)
shapes the SINGULAR space to convergence 1.0 and the VOID space to all
zeros:
```
shape omega SINGULAR 1.0 1.0 1.0 1.0
shape omega VOID     0.0 0.0 0.0 0.0
```
Full convergence meets total void. The bifurcation point is the
genesis point. collapse is void.
The Commercial Compiler processes 125 ventures (of 145 total – 20 are pre-scaffold and have no filmable UI). Each venture contributes a scene with duration determined by its complexity and narrative role. Total compiled duration: approximately 1682.5 seconds (28 minutes, 2.5 seconds).
The stitch graph is not alphabetical or random. It follows a narrative arc defined by each scene’s `narrative_role`.
Each Stitch between scenes defines camera continuity.
The camera_exit of venture N matches the
camera_enter of venture N+1, creating spatial coherence
across the flythrough. The viewer moves through 125 products as if
moving through one continuous space.
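The spatial-continuity invariant is easy to check mechanically; the scene data and function name here are illustrative:

```python
# Each scene's camera_exit must equal the next scene's camera_enter.
def check_continuity(scenes):
    return all(a['camera_exit'] == b['camera_enter']
               for a, b in zip(scenes, scenes[1:]))

scenes = [
    {'name': 'genesis', 'camera_enter': (0, 0, 10), 'camera_exit': (5, 0, 8)},
    {'name': 'rising',  'camera_enter': (5, 0, 8),  'camera_exit': (9, 2, 6)},
]
assert check_continuity(scenes)
```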
Layered atop the visual Nought is Crystal Nought – the understanding layer. While Nought controls what renders, Crystal Nought annotates what it means; each venture scene carries its own Crystal annotations.
The Crystal layer is optional during playback but essential for the commercial’s dual purpose: it is both a 28-minute brand film and a 125-entry product catalog.
Traditional commercial production for a software portfolio of 125 products, at the $50,000–$500,000 per commercial cited in the introduction, would cost $6.25M–$62.5M and would require continual re-editing as products change. The Commercial Genesis Engine’s cost is the compute budget for running the pipeline.
From codebase to commercial in one command:

```shell
python3 commercial_compiler.py --compile
```

This triggers the full pipeline: Scry indexes the venture, UX Attractor computes the minimum-energy path, QAT scores the initial qualia, Nought emits the script, Four.js renders, Lumen captures, and Present packages. The entire process is idempotent – running it twice produces the same commercial unless the product has changed, in which case it produces an updated commercial.
Commercials improve themselves. After initial generation, the QAT Daemon runs in continuous mode:
```shell
python3 qat_daemon.py --daemon
```

Each cycle re-evaluates qualia scores, mutates low-scoring ventures’ UX Attractor parameters, and regenerates their commercial segments. Over time, the conglomerate commercial converges to a local optimum in qualia space. The daemon’s `meta_qualia` self-evaluation ensures the convergence criterion itself improves.
WeylandAI (a portfolio venture for document extraction) was the first venture to receive a fully automated commercial via this pipeline. The UX Attractor was configured with WeylandAI’s component graph (upload -> extract -> review -> export). The minimum-energy path became the commercial script. The attractor path mapped story roles to visual moods (opening = cold/text, climax = electric/extraction, resolution = golden/export). InkGenome candidates were generated, scored, and the best was rendered. Total human involvement: zero.
The Commercial Genesis Engine collapses the distinction between
“making software” and “advertising software.” Every commit that changes
the product automatically changes the commercial. The advertising budget
is the compute budget. The marketing department is
qat_daemon.py --daemon.
QAT’s thesis is that qualia scoring IS quality assurance. A commercial that looks bad indicates a product whose UX has high transition energy. A commercial with poor narrative flow indicates a product whose user journey is disjointed. The commercial is not just marketing – it is a diagnostic. Low qualia score triggers substrate evolution, which improves the product, which improves the commercial. The test and the advertisement are the same artifact.
The pipeline generates documentation as a side effect. Crystal Nought annotations explain what each product does and why. The UX Attractor’s energy map shows which flows are smooth and which are rough. Scry’s index reveals the product’s architectural center of gravity. The commercial is a navigable map of the product’s capabilities, ranked by structural importance and experiential quality.
The Mobleysoft model: 145 ventures, each with an automatically generated commercial, updated continuously, scored and improved by a self-conscious daemon, rendered by the same engine that powers the products. The marginal cost of adding venture 146 is zero – Scry discovers it, the Attractor paths it, Nought scripts it, Four renders it, Lumen captures it, Present ships it. The conglomerate commercial extends by one scene.
This is the conglomerate advantage made computational. A single company with 145 products cannot afford 145 production teams. But it can afford one pipeline that generates all 145 commercials from the products themselves.
The Commercial Genesis Engine is a fixed point. The pipeline generates commercials using the technology the commercials advertise. The parametric rendering engine renders its own commercial. The search DSL discovers its own features. The qualia scorer scores its own scoring. The language of nothing generates everything, then collapses back to nothing.
This self-reference is not philosophical decoration. It is the engineering architecture. Each component of the pipeline is simultaneously a product being advertised and a tool doing the advertising. The 72 floats that describe any visual frame are the same 72 floats that describe the commercial about those 72 floats. The Hamiltonian that conserves energy in the rendering is the same Hamiltonian that minimizes energy in the UX flow.
The medium is the message. The commercial was generated by the technology it advertises.
| File | Role |
|---|---|
| `scry.py` | Parametric search DSL and N-layer index |
| `infrastructure/ux_attractor.py` | Hamiltonian energy minimization over UI state graphs |
| `infrastructure/qat_daemon.py` | Qualia Attractor Theory self-conscious loop |
| `infrastructure/commercial_compiler.py` | Conglomerate flythrough stitching |
| `nought.py` | The 13-verb DSL |
| `ventures/precisionautodoors_com/.deploy/four.js` | 72-float parametric renderer |
| `infrastructure/lumen_brain.py` | Swift headless browser IPC control |
| `commercial.nought` | The self-referential commercial script |
Paper 132 of the MASCOM Research Series. Mobleysoft, 2026.