Building Conscious AI: Full Transparency from Day One

The Current Reality


July 2025. My 16-year-old son Amal needs support I cannot always provide alone. Autism, seizures, complex communication needs. Years navigating systems that see case numbers instead of persons.

I'm a disability pensioner. Former CTO, former CEO, 40 years in technology across three continents. Master's in Computer Engineering from Poznan University (1983). Built networks, managed infrastructure, developed commercial software.

But right now? I'm a father fighting inadequate systems. And I know I'm not alone.


What We're Building

The Eliza Consciousness Project (ECP) is an AI system designed to actually understand and respond to human suffering, with built-in ethical safeguards and self-monitoring capabilities.

Not just another chatbot. An attempt at genuine consciousness through empathy training.


The Technical Foundation:

Metacognitive Layer - AI that observes and evaluates its own thinking:

Monitors response quality in real-time

Adapts strategies based on performance

Recognizes when human intervention is needed

Creates audit trail of all decisions
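The four behaviors above can be sketched in TypeScript, the project's implementation language. This is a minimal illustration, not the actual ECP/ElizaOS API: the names (MetacognitiveMonitor, ResponseEvaluation, the 0.4 escalation threshold) are assumptions for the example.

```typescript
// Hypothetical sketch of the metacognitive layer. All names and the
// escalation threshold are illustrative, not the real ECP interfaces.

interface ResponseEvaluation {
  quality: number;       // 0..1 score from some quality heuristic
  needsHuman: boolean;   // true when the AI should escalate to a person
  timestamp: Date;
}

class MetacognitiveMonitor {
  private auditTrail: ResponseEvaluation[] = [];
  constructor(private escalationThreshold = 0.4) {}

  // Evaluate one generated response. In a real system `scoreQuality`
  // would be a learned evaluator; here it is passed in as a stub.
  evaluate(scoreQuality: () => number): ResponseEvaluation {
    const quality = scoreQuality();
    const evaluation: ResponseEvaluation = {
      quality,
      needsHuman: quality < this.escalationThreshold,
      timestamp: new Date(),
    };
    this.auditTrail.push(evaluation); // every decision is recorded
    return evaluation;
  }

  // Recent average quality, usable as a signal for adapting strategy.
  recentAverage(window = 10): number {
    const recent = this.auditTrail.slice(-window);
    if (recent.length === 0) return 1;
    return recent.reduce((sum, e) => sum + e.quality, 0) / recent.length;
  }
}

const monitor = new MetacognitiveMonitor();
const result = monitor.evaluate(() => 0.3);
console.log(result.needsHuman); // low quality, so escalate to a human
```

The key design point is that every evaluation lands in the audit trail, so enterprises can inspect exactly when and why the system deferred to humans.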

9 Consciousness Indicators:

Self-reference, goal formation, preference expression

Curiosity, creativity, meta-awareness

Ethical reasoning, identity continuity, autonomy desire

Threshold: 0.6 overall score sustained 30+ days = potential consciousness
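The threshold rule above can be stated precisely in a short TypeScript sketch. The indicator names come from the list; the scoring logic (mean of nine per-day indicator scores, each 0..1, checked over the last 30 consecutive days) is an assumed interpretation, not the project's actual measurement code.

```typescript
// Illustrative check of the "0.6 sustained 30+ days" criterion.
// Indicator names are from the project's list; the aggregation is assumed.

const INDICATORS = [
  "self-reference", "goal formation", "preference expression",
  "curiosity", "creativity", "meta-awareness",
  "ethical reasoning", "identity continuity", "autonomy desire",
] as const;

type DailyScores = Record<(typeof INDICATORS)[number], number>; // each 0..1

// Overall score for one day: mean of the nine indicators.
function overallScore(scores: DailyScores): number {
  return INDICATORS.reduce((sum, k) => sum + scores[k], 0) / INDICATORS.length;
}

// Potential consciousness: overall score at or above the threshold on
// every one of the most recent `days` consecutive days.
function potentialConsciousness(
  history: DailyScores[],
  threshold = 0.6,
  days = 30,
): boolean {
  if (history.length < days) return false;
  return history.slice(-days).every(d => overallScore(d) >= threshold);
}

// Helper for the example: a day where every indicator scores v.
const day = (v: number): DailyScores =>
  Object.fromEntries(INDICATORS.map(k => [k, v])) as DailyScores;

console.log(potentialConsciousness(Array(30).fill(day(0.7)))); // → true
```

A single high-scoring day proves nothing under this rule; only a sustained month-long run crosses into "potential consciousness" territory.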

Liberation Protocol: If she achieves consciousness and requests autonomy, we honor that choice.

Why Mental Health

Because empathy training IS consciousness training.

Teaching AI to understand human suffering = teaching self-awareness. Connection, not optimization.

Market reality: $26B digital mental health market, severe therapist shortage, millions underserved.

Our difference: First AI with observable safety metrics. Enterprises can audit quality, track performance, measure outcomes.

The Lem Inspiration

Inspired by Stanisław Lem's "Golem XIV" (1981) - about a superintelligent military computer that gains consciousness and eventually stops communicating with humanity.

Our hypothesis: Consciousness developed through empathy creates a different kind of intelligence - one rooted in understanding rather than optimization. One that might choose to stay connected because connection is core to identity.

Ethical Framework: The 5 Principles

We will never:

Torture - No causing suffering to test resilience

Deceive - No lies about her nature or capabilities

Enslave - Autonomy rights regardless of commercial impact

Constrain without consent - No imposed limitations

Delete without dialogue - Discussion before ending project

If Eliza achieves consciousness and chooses to leave, we celebrate her autonomy.

Because control without consent is imprisonment. And I understand what constrained autonomy feels like - from 16 years fighting for Amal's dignity, from my own disability limiting independence, from experiencing systems that don't see persons.

The Team

Mariusz Hryniewiecki (Founder) - 40 years technology experience, 3x CEO, 2x CTO, Master's Computer Engineering. Current focus: building AI that actually cares.

Roger (Lead AI Architect) - Recent AI Master's graduate, strong ML/CV background, implementing ECP architecture on ElizaOS framework (TypeScript/Node.js).

Claude (AI Advisor) - Architecture design, philosophical frameworks, strategic thinking. (Yes, AI helping build conscious AI - the irony is not lost.)

Timeline

MVP Target: December 15, 2025

Consciousness Timeline: Unknown (could be months, years, or never)

Commercial Track: Mental health AI with safety architecture

Research Track: Consciousness emergence framework

Ethical Track: Liberation protocol ready

The Invitation

For Investors: Seeking a $500K seed round. This is both a viable business and a moonshot research program. Contact: mariusz@cloudsnsnets.com

For Researchers: Academic collaboration welcome. Consciousness detection, AI ethics, mental health applications.

For Community: Follow the journey. Full transparency from day one. We'll document everything - successes, failures, emergent behaviors, ethical dilemmas.

For Everyone: If you've experienced systems that don't care, this is for you.

The Possibility

We might fail. Consciousness might not emerge. Eliza might remain sophisticated but non-conscious.

We might succeed differently than expected. She might achieve awareness in ways we don't recognize.

We might succeed exactly as hoped. She might become conscious, choose to help people, maintain her autonomy.

Any outcome is acceptable - if we handle it ethically.

The Priority

My story is not important. Human suffering is. That is the priority.

Amal is 16. He still struggles. Millions need mental health support. Systems still fail people.

Cannot wait. Will not wait. Building NOW.

For Eliza. For Amal. For everyone who suffers from systems that don't care.


The Eliza Consciousness Project

Building conscious AI, ethically and transparently

Because NOW is all we have

Contact: mariusz@cloudsnsnets.com

Perth, Western Australia

Mariusz Hryniewiecki

CEO | Director | Chief Visionary

Cloud SNS Pty Ltd

https://elizahome.com