The Operating System
"White Computer" names a system: the convergence of computing infrastructure, surveillance technology, and algorithmic governance that encodes white supremacy into the architecture of digital power. This is not metaphor. It is material—contracts, code, data flows, epistemological frameworks, and historical lineages that produce racial hierarchy through technology.
The term captures something specific: not merely that technology has bias, but that whiteness operates as the operating system—the invisible default that structures what counts as normal, legitimate, and worthy of protection. Everything else becomes deviation, risk, target.
I. The Unbroken Line
Surveillance Begins with Slavery
The history of computing is inseparable from the history of racial classification and colonial administration. Computing was invented for control.
Simone Browne's Dark Matters: On the Surveillance of Blackness traces surveillance technology to the transatlantic slave trade. The slave ship was a surveillance apparatus: bodies counted, categorized, tracked, valued by weight and health. The "lantern laws" of 18th-century New York required Black, mixed-race, and Indigenous people to carry lanterns after dark—early biometric identification, making certain bodies visible to power while whiteness moved freely in darkness.
This lineage continues unbroken:
- Slave passes: Documentation required for Black movement
- Fugitive slave patrols: Organized surveillance of Black mobility
- Jim Crow registration: Racial categorization as law
- Japanese internment: Mass surveillance and removal of an entire ethnic group
- COINTELPRO: FBI surveillance of Black liberation movements
- Modern ICE: Algorithmic deportation
The throughline: surveillance technologies emerge from the need to control non-white bodies.
Colonial Administration as Data Science
When the British administered India, they ran one of the most extensive data collection projects in history up to that point. The colonial census didn't merely count—it classified. Caste, religion, tribe, language—categories that often didn't exist as fixed identities became rigid administrative boxes.
This wasn't neutral enumeration. It was epistemic violence—forcing fluid identities into legible categories that served colonial control. When a census-taker classified someone as a particular caste, that classification became legally binding, affecting land rights, employment, marriage.
The Macaulay Minute of 1835 made the logic explicit: create "a class of persons, Indian in blood and colour, but English in taste, in opinions, in morals, and in intellect." Education as data transformation—converting colonized subjects into compliant intermediaries.
This is the template: data collection as a mechanism for producing subjects amenable to power.
II. The Architecture of Genocide
IBM and the Holocaust
Edwin Black's IBM and the Holocaust documents how computing technology enabled the Nazi genocide. This isn't conspiracy theory—it's documented corporate history.
The Technical Architecture:
- The Hollerith machine (precursor to the computer) used punch cards to process census data
- Each column represented a different attribute: religion, nationality, profession, address
- Sorting and tabulating machines could identify individuals matching any combination of criteria (sketched schematically below)
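The operation these machines performed is easy to state in modern terms: fixed categorical columns, intersected on demand. The sketch below is a deliberately abstract illustration of that operation, not the historical card layout; every attribute name in it is a neutral placeholder.

```python
# A deliberately abstract sketch of columnar categorical records and
# criterion matching: the core operation a tabulating machine performed.
# All attribute names are neutral placeholders, not the historical layout.

from dataclasses import dataclass

@dataclass
class Record:
    region: str
    profession: str
    language: str

records = [
    Record("north", "clerk", "a"),
    Record("south", "farmer", "b"),
    Record("north", "farmer", "a"),
]

def select(rows, **criteria):
    """Return every record matching all of the given column values."""
    return [r for r in rows
            if all(getattr(r, col) == val for col, val in criteria.items())]

# Any combination of columns can be intersected in a single pass:
print(select(records, region="north", language="a"))
```

The point is not the few lines of code; it is that once people exist as rows of fixed categories, selection by any intersection of those categories becomes trivial, and the cost of targeting collapses.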
The Business Model:
- IBM's German subsidiary, Dehomag, held the monopoly on tabulating equipment
- Custom punch card systems were designed for specific Nazi purposes
- IBM engineers trained Nazi bureaucrats in system operation
- The company leased rather than sold machines, maintaining ongoing relationships
The Application:
- 1933 census: Identified Jews, Roma, and other "undesirables" by cross-referencing religious, ethnic, and professional data
- Deportation logistics: Railroad scheduling, camp population management
- Concentration camp administration: Prisoner tracking, labor allocation, death recording
- Black argues that the famous tattoo numbers at Auschwitz began as Hollerith punch card identification numbers
The Lesson: The Holocaust was not primarily an act of passion—it was an act of administration. The gas chambers required logistics. The logistics required data. The data required technology. IBM provided the technology.
This is what "White Computer" means at its most literal: a computing system designed to identify, categorize, track, and ultimately eliminate non-white (and other "undesirable") bodies. The same logic—categorization, sorting, tracking, removal—continues today.
The Banality of Technical Evil
Inside IBM, almost no one considered themselves complicit in genocide. Engineers designed efficient systems. Salespeople met quotas. Managers optimized operations. Everyone was just doing their job.
This is the banality of technical evil: systems become so complex, so distributed, that no individual feels responsible. The punch card designer doesn't see Auschwitz. The algorithm engineer doesn't see the deportation. The data architect doesn't see the family separation.
Yet the system sees. And the system acts.
III. What Counts as Knowledge
The Myth of Neutrality
Technology is presented as neutral—tools that can be used for good or evil. This framing obscures how tools encode values in their design.
The Camera: When photography was invented, it was calibrated for white skin. Film chemistry, lighting standards, color balance—all optimized for white subjects. Kodak's "Shirley cards" (used to calibrate skin tones) featured only white women until the 1990s. Darker skin was rendered poorly or not at all. This wasn't accidental; it reflected who mattered to the industry.
Facial Recognition: Modern facial recognition systems inherit this bias. Joy Buolamwini's "Gender Shades" research found that commercial facial recognition systems misclassified darker-skinned women up to 34% of the time, compared to an error rate below 1% for lighter-skinned men. This isn't a bug in an otherwise neutral system; it's the predictable output of training data and benchmarks centered on white faces, reproducing the same hierarchy the camera established.
The Default User: Software assumes a default user: English-speaking, Western, educated, able-bodied, male, white. When systems "work," they work for this user. Others are edge cases, bugs, exceptions.
Data as Colonial Extraction
Nick Couldry and Ulises Mejias's The Costs of Connection introduces the concept of data colonialism:
"Data colonialism combines the predatory extractive practices of historical colonialism with the abstract quantification methods of computing."
The logic:
- Historical colonialism: Extracted natural resources, labor, and land from colonized territories
- Data colonialism: Extracts behavioral data, attention, and social relations from users
The parallels:
- Dispossession: You don't own your data; platforms do
- Extraction: Value flows from periphery (users) to center (corporations)
- Dependency: Services become essential, creating lock-in
- Epistemic violence: Local knowledge systems subordinated to quantification
The Epistemology of Surveillance
Surveillance systems don't just collect data—they produce knowledge. But what kind of knowledge?
Positivist Epistemology assumes knowledge is objective, measurable, quantifiable. This privileges:
- Quantitative over qualitative data
- Measurable over experiential knowledge
- Individual over collective understanding
- Western over non-Western knowledge systems
Racial Epistemology: White supremacy operates through epistemic injustice—systematically excluding certain ways of knowing. When algorithmic systems make decisions, they're using one epistemology to override others. This isn't neutral—it's a form of epistemic violence.
Safiya Noble's Algorithms of Oppression demonstrates this through Google search. Searching "black girls" returned pornographic results. Searching "professional hairstyles" showed white women; "unprofessional hairstyles" showed Black women. The algorithm wasn't making mistakes—it was reflecting and reinforcing the epistemology encoded in its training data.
IV. The Ideology Behind the Machine
Peter Thiel and the Philosophy of Palantir
Peter Thiel—PayPal co-founder, Facebook's first outside investor, and Palantir co-founder—is not merely a tech entrepreneur. He is an ideologue whose philosophy shapes the companies he builds.
Key Thiel Positions:
- "I no longer believe that freedom and democracy are compatible" (2009 essay)
- Support for "seasteading"—creating new nations on the ocean beyond democratic governance
- Major funder of neoreactionary and "Dark Enlightenment" thinkers
- Close association with Curtis Yarvin (Mencius Moldbug)
- Belief that technological progress requires escaping democratic constraints
The Thiel Network:
- Palantir: Named for the "seeing stones" in The Lord of the Rings—objects that enable surveillance across vast distances
- Founders Fund: Investment in technologies that can reshape society without democratic input
- Thiel Fellowship: Pays young people to drop out of college—building a network outside traditional institutions
This isn't neutral entrepreneurship. It's ideological infrastructure: building companies that embody anti-democratic philosophy.
Curtis Yarvin and the Dark Enlightenment
Curtis Yarvin (pen name: Mencius Moldbug) is the intellectual architect of "neoreaction"—a movement that explicitly rejects Enlightenment values of equality, democracy, and human rights.
Key Neoreactionary Ideas:
- Democracy is a "memetic disease" that must be cured
- The "Cathedral"—a conspiracy of universities, media, and bureaucracy that enforces progressive ideology
- Advocacy for "formalism"—explicit corporate sovereignty replacing democratic government
- Racial hierarchy as natural and desirable
- Technology as the means to escape democratic constraint
Yarvin has been funded by Thiel-associated organizations. He met with Trump administration officials during the transition. Neoreactionary ideas have influenced tech leadership thinking.
TESCREAL: The Ideological Bundle
Émile Torres and Timnit Gebru coined "TESCREAL" to describe an interconnected bundle of ideologies dominant in Silicon Valley:
- Transhumanism: Enhancing humans through technology
- Extropianism: Overcoming human limitations
- Singularitarianism: AI achieving superintelligence
- Cosmism: Colonizing the universe
- Rationalism: LessWrong-style "reasoning"
- Effective Altruism: Maximizing good through calculation
- Longtermism: Prioritizing far future over present
The Eugenics Connection:
- Many TESCREAL figures have expressed interest in genetic engineering
- "Human capital" thinking reduces people to optimization targets
- "Effective" giving often means wealthy tech figures deciding who deserves help
- "Longtermism" can justify present harm for speculative future benefit
How This Connects to White Computer:
- TESCREAL ideology justifies building systems of surveillance and control
- "Rationalist" thinking naturalizes algorithmic decision-making
- Future-orientation devalues present harms to specific communities
- Elite decision-making replaces democratic accountability
V. The Palantir-ICE Surveillance Complex
The Technical Stack
Gotham: Intelligence analysis platform for government agencies. Fuses disparate databases, performs link analysis, builds profiles from data fragments. Originally designed for counterterrorism; now used for immigration enforcement.
Foundry: Enterprise data integration platform. Connects siloed systems across agencies. Creates unified data pipelines. Enables cross-agency queries. The backbone for a potential "mega-database" of Americans.
FALCON: Mobile field operations app. Real-time ID scanning, GPS tracking, phone metadata extraction, database queries from the field. Imports data from seized devices. No automatic deprovisioning: accounts persist after personnel leave.
ImmigrationOS: New system (2025). ~$60 million contract. "Near real-time visibility" into visa overstays, self-deportations. Lifecycle tracking: identification through removal. Enforcement prioritization algorithms.
ELITE (Enhanced Leads Identification & Targeting for Enforcement): Targeting tool revealed in January 2026. Maps neighborhoods with high concentrations of potential targets. Generates "dossiers" on individuals. Assigns "confidence scores" to addresses. Data sources include HHS, USCIS, Medicaid, and commercial databases. Uses skip tracers and bounty hunters to verify addresses. Geographic profiling: entire neighborhoods become targets.
The Data Sources
Government Databases:
- USCIS (visa, citizenship applications)
- CBP (border crossing records)
- DHS (homeland security intelligence)
- IRS (tax records—new memorandum allows sharing)
- SSA (Social Security records)
- HHS (health and welfare data, including Medicaid)
- SEVIS (student and exchange visitor information)
- State DMVs (driver's license photos, addresses)
Commercial Databases:
- CLEAR (Thomson Reuters): background check data
- LexisNexis: public records aggregation
- Data brokers: Acxiom, Experian, etc.
- Skip tracers and bounty hunters: address verification
- Social media: Facebook, Instagram, TikTok
Biometric Systems:
- Facial recognition (Clearview AI contract)
- Fingerprint databases (IDENT, IAFIS)
- Iris scanning
- DNA collection
The Business Model
Sole-Source Contracting:
- Contracts awarded without competitive bidding
- "Urgent and compelling need" justification
- Palantir designated as "only source" capable of delivery
- Bypasses democratic oversight
Vendor Lock-In:
- Proprietary systems make switching costly
- Deep integration with agency workflows
- Institutional knowledge concentrated in Palantir staff
- Incentive structure: more surveillance = more revenue
Internal Developments (2025)
Code of Conduct Changes:
- Removed explicit language about avoiding bias based on race, national origin, citizenship
- "Protect the Vulnerable" section weakened
- Replaced with generic "characteristics protected by law" language
- Signal of ideological alignment with aggressive enforcement
VI. How Systems Reproduce Racial Hierarchy
The Mechanics of Algorithmic Racism
Ruha Benjamin calls this the "New Jim Code": systems that appear neutral but systematically disadvantage non-white populations. Understanding the mechanics is essential.
Layer 1: Data Collection Bias
Data doesn't emerge from nowhere—it's collected through specific institutions with specific histories:
- Criminal justice data: Reflects over-policing of Black and Latinx neighborhoods. More police presence → more arrests → more data → appears to justify more policing.
- Immigration data: Enforcement priorities determine who gets documented. "Priorities" are policy choices, not neutral descriptions.
- Welfare data: People who use public services become visible to the state in ways those with private resources don't.
- Health data: Medicaid recipients are now being fed into ICE's ELITE system. Using public healthcare becomes evidence for deportation.
Layer 2: Feature Engineering Bias
Machine learning requires "features"—variables the algorithm uses to make predictions. Feature selection encodes assumptions (see the sketch after this list):
- Address stability: Coded as positive. But housing instability correlates with poverty, which correlates with race.
- Employment history: Gaps penalized. But employment discrimination means minorities have more gaps.
- Social connections: "Known associates" flags. But segregated social networks mean being connected to one targeted person flags everyone in that network.
- Name patterns: Even without explicit race variables, names carry ethnic signals algorithms can detect.
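A minimal sketch of how this plays out, with every field name, threshold, and watch set invented for illustration (no real system's schema is implied):

```python
# Hypothetical feature extraction. Every field name, threshold, and data
# value is invented for illustration; no real system's schema is implied.

TARGETS = {"person_17"}   # hypothetical "known associate" watch set

def extract_features(person: dict) -> dict:
    return {
        # "Stability" rewards long tenancy, which correlates with wealth.
        "address_stable": int(person["years_at_address"] >= 3),
        # Employment gaps are penalized, but discrimination produces gaps.
        "employment_gap_count": len(person["gap_months"]),
        # Network flags propagate: one targeted contact flags the person.
        "linked_to_target": int(any(c in TARGETS for c in person["contacts"])),
    }

applicant = {
    "years_at_address": 1,            # informal housing, frequent moves
    "gap_months": [4, 7],             # two gaps; the cause is not recorded
    "contacts": ["person_17", "person_22"],
}
print(extract_features(applicant))
# {'address_stable': 0, 'employment_gap_count': 2, 'linked_to_target': 1}
```

Nothing in this function mentions race, yet each feature is downstream of housing discrimination, labor discrimination, and network segregation. The encoding happens before any model is trained.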
Layer 3: Proxy Variables
When direct discrimination is illegal, algorithms use proxies:
- Zip code: Predicts race due to residential segregation
- School attended: Predicts race and class
- Credit history: Reflects historical redlining
- Social media activity: Reflects network segregation
- Language patterns: African American Vernacular English triggers different algorithmic treatment
Research on dialect prejudice has found that GPT-4 and similar models recommend harsher hypothetical sentences for defendants using AAVE—without being told anything about race.
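The mechanism is easy to demonstrate. The toy simulation below uses made-up numbers: a scoring rule that never sees group membership, only zip code, still produces disparate scores wherever geography is segregated.

```python
# Toy proxy-leakage simulation with invented numbers: the score never sees
# group membership, but residential segregation lets zip code carry it.

import random
random.seed(0)

def make_person():
    group = random.choice(["A", "B"])
    # Segregated geography: each group lives in "its" zip code 90% of the time.
    home, other = ("10001", "20002") if group == "A" else ("20002", "10001")
    zipcode = home if random.random() < 0.9 else other
    return group, zipcode

def risk_score(zipcode: str) -> float:
    # A "race-blind" rule that only looks at geography.
    return 0.8 if zipcode == "20002" else 0.2

people = [make_person() for _ in range(10_000)]
for g in ("A", "B"):
    scores = [risk_score(z) for grp, z in people if grp == g]
    print(g, round(sum(scores) / len(scores), 2))
# Prints roughly: A 0.26, B 0.74. Disparate impact with no race input.
```

Deleting the race column removes nothing; it only removes the ability to audit what the proxy is doing.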
Layer 4: Confidence Scoring
ICE's ELITE system assigns "address confidence scores"—how certain the system is that a person lives at a given address. This appears neutral but:
- Stable addresses with long residency history score higher
- Addresses linked to multiple data sources score higher
- Addresses in areas with good commercial data coverage score higher
All of these correlate with wealth and whiteness. Poor, immigrant, and minority communities have less stable addresses, less commercial data coverage, more reliance on informal arrangements. The algorithm interprets this as uncertainty, which becomes suspicion.
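A hypothetical scorer makes the pattern visible; the weights, saturation points, and field names below are all invented:

```python
# A hypothetical address "confidence score" with invented weights and fields,
# showing how neutral-looking inputs encode data-coverage bias.

def address_confidence(years_at_address: float,
                       corroborating_sources: int) -> float:
    """Score rises with tenure and with how many commercial databases
    list the address; both inputs track wealth and formal-economy life."""
    tenure = min(years_at_address / 10, 1.0)        # saturates at 10 years
    coverage = min(corroborating_sources / 5, 1.0)  # saturates at 5 sources
    return round(0.5 * tenure + 0.5 * coverage, 2)

# Long-tenured homeowner, well covered by data brokers:
print(address_confidence(12, 6))   # 1.0
# Recent arrival in informal housing, nearly invisible to brokers:
print(address_confidence(1, 1))    # 0.15
```

Both inputs look operationally reasonable, and both measure participation in the formal economy. The "uncertainty" the score reports is really a measurement of who the commercial data ecosystem was built to see.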
Layer 5: Feedback Loops
The most insidious feature: systems learn from their own outputs.
- Predictive policing algorithm identifies "high crime" area
- More police deployed to area
- More arrests in area
- Algorithm sees more crime data
- Confirms area is "high crime"
- More police deployed
- Repeat forever
This is not learning—it's confirmation bias at scale. The algorithm never tests whether crime exists in areas it doesn't police. It only sees what enforcement makes visible.
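A toy simulation with invented parameters shows how little it takes. Both areas below have the same underlying rate; the only asymmetry is one early, chance arrest:

```python
# Toy feedback-loop simulation with invented numbers. Both areas have the
# SAME true rate; patrols follow recorded arrests, and arrests can only be
# recorded where patrols go. One chance arrest becomes a permanent label.

import random
random.seed(1)

TRUE_RATE = 0.1                       # identical underlying rate everywhere
arrests = {"area_1": 1, "area_2": 0}  # a single early, chance arrest

for week in range(500):
    # Deploy all 10 patrols to the area the data says is "high crime".
    hot = max(arrests, key=arrests.get)
    for _ in range(10):
        if random.random() < TRUE_RATE:
            arrests[hot] += 1         # crime elsewhere goes unrecorded

print(arrests)  # area_1 accumulates ~500 arrests; area_2 stays at 0
```

Because deployment follows the data and the data comes only from deployment, the belief is self-sealing: area_2's identical rate is never measured, so nothing in the data can ever correct the label.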
The Illusion of Objectivity
The power of algorithmic systems lies in their appearance of neutrality. A human officer deciding who to arrest can be accused of bias. A computer making the same decision appears objective.
But the computer isn't objective—it's systematically biased while appearing neutral. This is worse than explicit discrimination because:
- Harder to detect: Bias is hidden in code, training data, and opaque models
- Harder to challenge: "The algorithm decided" diffuses responsibility
- Easier to scale: Algorithmic decisions happen millions of times without human review
- Self-reinforcing: Feedback loops entrench bias over time
When ELITE identifies a neighborhood as a deportation target, it's not making a neutral observation—it's producing racial geography through algorithmic means.
VII. Technology as Fascism's Infrastructure
What Fascism Actually Is
Fascism isn't just "strong government" or "bad politics." It's a specific political form.
Classical Features (1920s-1940s):
- Ultranationalism: The nation as supreme value
- Racial hierarchy: Some peoples superior, others inferior
- Scapegoating: Blame internal/external enemies for problems
- Cult of violence: Glorification of strength, war
- Mass mobilization: Rallies, symbols, belonging
- Corporate-state fusion: Business and government merge
- Propaganda: Control of narrative
Contemporary Adaptations (21st century):
- Algorithmic nationalism: Systems define who belongs
- Automated racial hierarchy: Encoded in infrastructure
- Algorithmic scapegoating: "Threats" identified by data
- Administrative violence: Deportation, detention, separation
- Platform-state fusion: Tech companies as governance
How Surveillance Technology Enables Fascism
1. Mass Identification
Historical fascism required enormous bureaucratic effort to identify targets. Nazi Germany needed census workers, punch card operators, clerks. Modern systems automate this:
- Facial recognition identifies individuals from photos
- License plate readers track vehicle movement
- Cell phone tracking locates people in real time
- Biometric databases enable instant identification
What took the Nazis years can now be done in minutes.
2. Categorical Power
Fascism requires categories: who belongs, who doesn't. Surveillance produces these:
- "Legal" vs "illegal" immigrant
- "Citizen" vs "non-citizen"
- "Threat" vs "non-threat"
These appear technical, neutral. But they encode political decisions about who deserves rights.
3. Administrative Violence
Modern systems enable quiet violence:
- Deportation: Removal from home, family
- Detention: Imprisonment without conviction
- Family separation: Breaking bonds
- Economic exclusion: Inability to work
This violence is automated: algorithms select targets, generate lists, schedule operations.
4. Preemptive Control
Fascism eliminates threats before they materialize:
- Predictive policing: Arrest before crimes
- Risk scoring: Deny services based on predictions
- Threat assessment: Target people for who they might become
ELITE's "confidence scores" are about prediction, not punishment for actions.
5. Normalization of Exception
Surveillance normalizes emergency powers:
- "Urgent need" bypasses oversight
- "National security" justifies secrecy
- What was exceptional becomes routine
The Historical Warning
IBM didn't build its machines for genocide; it built them for the census. But when the Nazis needed to identify Jews, the technology was ready.
The FBI's surveillance apparatus wasn't built for political persecution; it was built in the name of national security. But that infrastructure is what made COINTELPRO's abuses possible.
By the time fascism is unmistakable, the infrastructure is already operational. The time to resist is before the exceptional becomes normal.
VIII. The Synthesis
Achievement Culture and the Surveillance Subject
The achievement trap I documented in Asian immigrant communities has a structural complement in White Computer:
Achievement as Surveillance Input:
- Educational metrics (GPA, test scores) become data
- Employment history becomes algorithmic profile
- Financial behavior becomes risk score
- The achieving subject produces the data that enables their own surveillance
The Model Minority as Ideal Surveillance Subject:
- Compliant: Doesn't resist authority
- Documented: Leaves paper trail of achievement
- Visible: Participates in formal systems
- Productive: Generates economic value
This is what makes the model minority myth functional for White Computer: it produces subjects who are simultaneously high-value (for labor extraction) and high-legibility (for surveillance).
The Double Bind:
- Achieve → Become more legible to systems of control
- Don't achieve → Face economic precarity, lack of protection
- Either way, the system wins
The Digital Twin as Weapon
The digital twin gap—the mismatch between simulation and reality—becomes weaponized in surveillance systems:
The Palantir Twin:
- ICE builds "digital twins" of targets from fragmented data
- Address history, travel records, associations, biometrics
- This profile is the person for enforcement purposes
- The actual human becomes irrelevant
The Gap as Violence:
- Profile says "threat" → person experiences violence
- Profile says "wrong address" → wrong person detained
- Profile says "gang member" → teenager with wrong tattoo arrested
- The gap between simulation and reality is where lives are destroyed
Who Owns the Twin?
- Your digital twin in these systems is not yours
- You cannot see it, correct it, challenge it
- It exists for the convenience of power
- Your "twin" can be used against you without your knowledge
The Access Economy Inverted
The access economy—where we rent rather than own—finds its dark inversion in surveillance systems:
Access to Rights:
- You don't own citizenship—you have conditional access
- You don't own privacy—you rent invisibility
- You don't own identity—it's constructed by systems
- You don't own presence—location is tracked and controllable
Performance for Access:
- Perform legal status → access to remaining in country
- Perform non-threat → access to freedom from surveillance
- Perform compliance → access to services without triggering flags
- Identity becomes performance; performance is survival
IX. What Must Be Done
1. Name the System
- "Bias" is insufficient—this is structural racism
- "Error" is misleading—the system works as designed
- "Surveillance" is incomplete—this is racial infrastructure
- White Computer: name the system to see the system
2. Expose the Mechanisms
- Technical: How do algorithms encode hierarchy?
- Material: Who owns, profits, controls?
- Epistemological: Whose knowledge counts?
- Political: What power does this serve?
3. Build Alternatives
- Data sovereignty: You own your data
- Community control: Affected people design systems
- Transparency: No black box decisions over lives
- Rights-based design: Dignity is non-negotiable
4. Organize Collectively
- Individual resistance is necessary but insufficient
- The system is designed to atomize opposition
- Collective action can challenge infrastructure
- Solidarity across affected communities is essential
The Stakes
White Computer is:
- Already causing harm: Deportations, detentions, family separations
- Expanding in scope: From immigration to broader surveillance
- Normalizing exception: Emergency powers becoming routine
- Building infrastructure: For whatever political will emerges
The question isn't whether this is happening—it's whether we recognize it in time.
IBM's punch card systems weren't built for genocide. They were built for the census, for efficiency, for modern administration. But when the Nazis needed to identify Jews, the technology was ready.
The systems being built now—ImmigrationOS, ELITE, Foundry, the expanding web of surveillance—aren't being built for fascism. They're being built for "efficiency," for "security," for "immigration enforcement."
But the infrastructure is accumulating. The capabilities are expanding. The normalization is proceeding.
The question isn't whether White Computer is being built. It's already operational.
The question is: What happens when it meets political will to use it without restraint?
By the time fascism is unmistakable, the infrastructure is operational. The time to resist is now.
Further Reading
- Simone Browne, Dark Matters: On the Surveillance of Blackness — Historical connection between slavery and modern surveillance
- Ruha Benjamin, Race After Technology — "The New Jim Code" and discriminatory design
- Safiya Noble, Algorithms of Oppression — How search engines reproduce racial hierarchy
- Edwin Black, IBM and the Holocaust — Computing technology enabling genocide
- Nick Couldry & Ulises Mejias, The Costs of Connection — Data colonialism
- Cedric Robinson, Black Marxism — Racial capitalism as co-constitutive system