Women + AI Summit 2.0: What Stayed With Me
The toughest part of the Women + AI Summit 2.0 wasn't deciding what to attend. It was accepting what I'd have to miss.
The schedule was so packed it was impossible to do everything "right." There were too many sessions I wanted to attend, too many people I wanted to talk to, and very little room to breathe between them. At one point, I made a conscious choice to step out.
I ended up sitting with two of my favorite people to play hooky with, Sunny Eaton and Lori Gonzalez, talking instead of listening. The conversation drifted, as it often does at good conferences, from tools to consequences. We circled around how systems like ChatGPT complicate the idea of a "reasonable expectation of privacy." We treat these tools like private conversations, even though they aren't. That gap, between how these systems feel and how they actually work, is where much of the risk lives.
Almost everyone else stayed put. The sessions were too good to miss. And still, that conversation lingered. It was a reminder that even at a tightly programmed conference, some of the most meaningful moments come from choosing where to spend your attention.
That tension, between structure and spontaneity, defined the weekend for me. In a way, it mirrored the larger conversations we were having about AI itself: how much to automate, when to pause, and how to choose deliberately in the face of overwhelming possibility.
What Kept Showing Up
Looking back at the schedule, it would be easy to describe the summit as a progression from talks to workshops to hands-on building. But what stayed with me were the questions that kept resurfacing.
One of the clearest throughlines was AI literacy: not fluency with tools, but understanding. How these systems behave. Where they fail. And how much agency we hand over when we use them. Several talks traced turning points: fear giving way to curiosity, skepticism shifting into discernment. There was a shared recognition that opting out isn't neutral. Literacy allows engagement to be intentional rather than reactive.
As the day shifted from listening to building, the emphasis moved from tools to workflows. The most interesting conversations weren't about clever outputs. They were about boundaries and judgment. Not just what can be automated, but what should be.
Ethics showed up not as philosophy, but as practice, especially around data quality and provenance. "Bias in, bias out" wasn't a slogan. It was a warning. The concern wasn't only what AI produces, but what we feed it: whose experiences are represented, which sources are trusted, and how quickly flawed assumptions scale once embedded in a system.
That thread carried directly into access to justice. AI wasn't framed as a magic fix. If anything, there was a sober recognition that poorly designed systems can widen gaps as easily as close them. Access to justice wasn't a mission statement. It was a design constraint.
Beneath it all was governance: not as a future policy question, but as something already underway. The people choosing vendors, setting internal standards, and defining acceptable use are shaping the future in real time. Governance defaults to whoever is in the room.
Taken together, the summit wasn't about celebrating AI. It was about responsibility. About engaging with technology in ways that hold up over time.
What We're Taking Home
I didn't leave with a list of tools to try. I left with a clearer framework for approaching AI work.
Literacy comes before leverage. Adoption is an organizational design problem, not just a training issue. Ethics starts with inputs, not outputs. Access to justice must be built into systems from the beginning. And governance is already underway, whether we acknowledge it or not.
None of that is flashy. But it is foundational.
If there was a shift, it was this: move deliberately. Build the capacity to pause. Ask better questions before accelerating. The long-term impact of AI won't be determined by how fast we move, but by how thoughtfully we do.
Women in the Loop
Which is why it feels important to name something I've intentionally held until now: this was a conference focused on and led by women.
That mattered, not as branding, but as posture.
At many AI conferences, there's a YOLO energy: build fast, deploy faster, sort out consequences later. The emphasis is on scale and upside, with risk treated as friction.
That wasn't the posture here.
Instead of "What can we build?" the questions more often looked like "What should we build?" and "Who does this affect?" There was comfort with uncertainty. Openness about tradeoffs. A willingness to admit what hadn't worked.
Even the design choices reflected that care. Speakers had walk-up songs. Dolly Parton's 9 to 5 marked transitions. Sessions were labeled mini, midi, and maxi, not by hierarchy, but by scale. None of it felt gimmicky. It felt intentional. Human.
This wasn't an absence of ambition. It was a different kind of ambition, one oriented toward durability, impact, and trust.
Representation didn't just change who was speaking. It changed what felt worth discussing.
Shaping What Gets Built
Cat Moon and her team at Vanderbilt Law created more than a conference. They created a space that modeled a different way of engaging with AI: curious, accountable, and deliberate.
I left not feeling pressured to adopt more tools, but clearer about the responsibility that comes with adopting any of them. In a field that often rewards speed, this felt like a necessary pause.
If this is where AI conversations are headed (more reflective, more inclusive, more honest about tradeoffs), it's a path worth investing in.
The future of AI isn't shaped in the abstract. It's shaped in moments and weekends like this one.
This is why it matters who's in the room when decisions are made.