Laura I. Gómez on Empowering Latinas in Tech and the Power of Mental Health Advocacy
In 2020, a study found that women made up 28.8% of the U.S. tech workforce, and that Latinas held just 2% of jobs in STEM fields. Laura I. Gómez stands out as one of the few Latina entrepreneurs making significant strides in Silicon Valley.
Her work has always bridged the gap between technology and social equity. Now, as the founder and CEO of Cepanoa Health, she has taken that mission to the most fundamental level by addressing the profound impact of Adverse Childhood Experiences (ACEs) on lifelong health.
Cepanoa Health believes that true healing begins by caring for the whole person, integrating the emotional, physical, and psychological well-being of children and their caregivers. Laura I. Gómez is helping to close gaps in healthcare, aiming to make care more equitable for those who need it most. But this isn’t her first act in reshaping industries.
For Laura I. Gómez, the Mission is Bridging Tech and Equity for Lasting Change
Before launching Cepanoa Health, Laura I. Gómez founded Atipica, an AI-driven platform designed to help companies create diverse and inclusive workforces. Although Atipica is no longer active, its legacy of advocating for equity in traditionally exclusionary spaces continues to influence Laura’s approach to solving systemic issues.
“I think it’s just part of who I am,” she explains. “Since I was a child and a teenager, I protested injustices here in California and was actively involved in my community.”
Laura’s journey to tech leadership was shaped by her early experiences. Born in Mexico and raised in California, she became acutely aware of the barriers people of color face across many facets of society.
Prioritizing Mental Health in Marginalized Communities
Laura’s mental health advocacy is deeply rooted in her own experiences and those of the people around her. Recognizing the unique challenges faced by people of color, and a significant gap in how mental health is addressed in these communities, she founded Proyecto SOL, a nonprofit organization that creates safe spaces for Latine individuals in the United States.
“I think it was a challenge of my own lived experience and those I see around us,” Laura explains. “People of color, Indigenous individuals, Black, and Latino communities have historically faced oppression, and these experiences are deeply rooted in our generations.”
With Proyecto SOL, her mission is to empower marginalized communities to prioritize their mental health, acknowledging that centuries of systemic oppression have added unique challenges for people of color. In her view, mental health care must become a central part of the fight for social justice.
“People who are building systems for change also need to prioritize their mental health,” Laura asserts. “We cannot carry on the legacy of our ancestors when we face even more obstacles due to the systems that perpetuate inequality and injustice.”
How Laura I. Gómez Balances Multiple Roles
Laura I. Gómez highlights the importance of setting boundaries. “It’s important that everyone establishes boundaries,” she shares. “Some weeks are busier than others, but I know where I’m prioritizing. For example, I don’t work on Saturdays. I try to set aside specific times to be offline.”
She emphasizes the need to create time for reflection and avoid the trap of being constantly tethered to work. “I don’t wake up and go straight to my computer, nor do I go to bed with my computer. I need time to restore my focus. It’s essential to understand your priorities—for instance, my priority at 8 or 9 PM is to sleep, while at 10 AM, it’s to engage in meaningful work.” Laura adds, “There’s a book called Rest is Resistance, which resonates with me. If I don’t rest, I can’t continue effectively. It’s crucial to recognize when my energy is running low and to take time to decompress.”
Her Advice for Young Latinas in Tech
Laura I. Gómez emphasizes the importance of collaboration and support. “Ask people,” she advises. “In my first startup, I didn’t ask enough for introductions. Now I say, ‘Hey, I know you know this person. Can you make this intro?’ Even if nothing happens from that meeting, people know you’re out there. When an opportunity arises, they can say, ‘I know a Latina who does this. Let me introduce you.’” She continues, “It’s important that when we are fundraising as Latin entrepreneurs, we elevate each other. We must support one another instead of viewing each other as competition. For me, that’s really important.”
Laura’s Vision for the Future
Looking forward, Laura I. Gómez is enthusiastic about how emerging technologies like AI can reshape industries. However, she remains cautious, stressing the need for responsible use to prevent worsening inequalities. “AI is powerful, but it’s only as unbiased as the data it’s trained on,” she warns.
During our conversation, Laura elaborated on the importance of evaluating AI systems critically, particularly generative AI models from major players like Facebook, OpenAI, Microsoft, Amazon, and Google. She noted, “It’s crucial to understand that these models are trained differently, and we must hold them accountable.”
The Importance of Scrutinizing AI Outputs in Sensitive Areas Like Health
“Take health advice, for example,” Laura continued. “If a study informing an AI model is based solely on white Europeans, can we trust its recommendations for diverse populations? We need to ask ourselves whether these systems are providing equitable outputs or merely perpetuating existing biases.”
Laura I. Gómez also stressed the need for awareness when using translation AI, questioning whether these systems account for cultural nuances or simply replicate English-speaking contexts. “We should evaluate which AI systems are the best fit for reducing biases and injustice in our communities,” she added.
Laura believes that developers should start with diverse datasets and examine their algorithms closely. “It’s a twofold approach: building your own AI responsibly while understanding how existing models are trained. We should test these models and share our findings, much like the makeup reviews we see on TikTok. If an AI model offers biased outputs, it’s essential to call it out.”
By pushing for transparency and accountability, Laura envisions a future where AI can genuinely contribute to social equity rather than reinforcing existing disparities. “In the future, I want to see TikToks that evaluate AI prompts for bias, just as we currently assess makeup products,” she concluded.