
Tech Isn't About Tools. It's About People.

Tech isn't about tools. Not really. At its core, it's about people.


Just like everything else.


And people have more power than they realize.





A tool should be judged by its usage. In a kitchen, there are many different types of knives, all of which are important for various reasons. A paring knife and a carving knife are built extremely differently. But you don't disparage a paring knife because it's smaller than a carving knife; you appreciate its ability to peel an apple and carve it into a swan (which, yes, I can actually do, thanks to a life-changing Jacques Pepin cooking class).


The point is this: Much in the same way data is just a set of numbers until you add context to it, a tool is just a thing until you use it. And who uses it? People.


Everything comes down to people. Always.


We're the ones making the decisions about how to use things. We're the ones determining demand. We're the ones crafting messages and publishing announcements and allocating resources. We're the ones hiring and firing, applying and rejecting, collaborating and communicating. We're the ones connecting, resonating, empowering, and growing.


We decide how to build a tool.

We decide how to use a tool.

We decide how to measure a tool's success and impact.


We need to remember that new technology will always come around -- thanks to the people who create it. And various tools will work or fizzle out -- due to the people who use them, praise them, shun them, break them, improve upon them, discard them, want them, need them, hate them, advocate for them, complain about them, and everything in between.


People are at the heart of everything. And we can't forget that, even in the midst of a major technological revolution.


Even then, it's still not about the tech. It's still, and always will be, about people.


How You Use a Tool Matters


You and I might own the same pen, but we're going to use it to write very different stories.


I might write in cursive; you might print. You might write a dark poem, while I author an uplifting self-care article (like the one I had published by Bella Grace this year!). You might write in long, flowery prose, while I refine my writing to short, concise sentences. Your characters might be fantastical while mine are rooted in my real-life experiences. Your setting might be on another planet, while my world is right here in my own backyard. Your story might be about grief, while mine is about hope. The tool didn't write those things. The person did. The reason the tool's output was so different in each scenario comes down to the person behind the tool.


It's the same with technology. How you use a tool matters. A tool may be built with certain capabilities, but those capabilities lie dormant until a human brings them to life. That means a human can and should choose how to bring those capabilities to life. The way a human uses a tool determines its outcome.


And if a human doesn't use a tool, the tool has no impact.


I am personally a bit baffled by the frantic helplessness I see people devolving into as they get swept up in the tidal wave of AI (particularly generative AI). I keep hearing people grow more and more frazzled as they insist that AI is here to stay and that anyone who does not learn and use GenAI in their jobs is going to become less effective. Less impactful. Obsolete. But I disagree. Humans are not obsolete. Robots can't achieve style, voice, context, nuance, strategic alignment, originality, judgment, or emotional connection the way humans can. And even if obsolescence were the risk, why would we continue using the technology that could do that to us? We have more power than we think we do. Use AI more wisely, or not at all, and control your own fate.


And remember, when someone says AI will take your job, shift your mindset--AI cannot make that decision, but a person can. And if a person decides a robot can do your job better than you can, that person is probably not worth having as a boss. There are people out there who believe in people--as I do. You will always have a place with us.


People Have More Power Than They Think


Fintech company Klarna replaced 700 members of its customer service team with AI chatbots. In 2025, Klarna CEO Sebastian Siemiatkowski “acknowledged that overemphasis on cost-cutting led to poorer service and emphasized the necessity of human interaction for customer satisfaction.” He admitted that though AI chatbots were cheaper to employ than human staff, they offered a "lower quality" output. Everything has a trade-off; Siemiatkowski learned firsthand that choosing low cost over high quality was the wrong move. AI didn’t force him to replace 700 employees; he made that choice himself. And then he, and the entire company, experienced the negative effects.


Siemiatkowski went on to say, "Some people refuse to use a company's services if they can only talk to a machine rather than a real person when help is needed. It's a lesson other firms that are also going all-in on AI may soon learn." This is a sound example of consumer demand driving meaningful shifts. Klarna’s customers didn’t want chatbots; they wanted humans. Their actions drove the company to reduce AI reliance and reinvest in human value. Consumers’ collective choice to speak out against robot customer service led to the CEO’s choice to re-employ humans.


Our experiences are made up of choices, and our individual actions can accumulate into trends that impact the future. The AI movement is no exception.


We need to manifest our intentions in our actions. What future do you want to see? Strive for congruence of your values, thoughts, beliefs, feelings, and behaviors. Consider goals and outcomes and implications. Act in strategic, meaningful ways. Build a future that is actually what we need. Identify a problem and solve it. Be vocal against things that are harmful. Spend your attention on things that matter to you and to society. Find unique ways to add value to your community.


Then we need to bring this mindset shift into our work. We control the narrative. We choose the messages we deliver, the guidelines we enforce, the cultures we nurture. We choose who to hire, what work to produce, what behaviors to celebrate. These aren’t inevitable actions. People are choosing to behave this way. We don’t have to replace humans with robots. Someone is choosing to do that. A person made that choice. If that choice aligns with your values, great. But if that choice does not align with your values, recognize that you too have a choice. You are not helpless. The power lies with you.


Your Actions Speak Volumes


Several CEOs have made the news by announcing that their employees must use AI and will be evaluated on their AI usage. More than that, they have stated their goal plainly: greater output in less time.


For starters, this is an example of horrible leadership communication. Many of these announcements were made in public, without prior warning to employees. That is a deplorable way to deliver a message and to launch a change initiative. Technological advancements elicit worries and fears in many people, and shifting people's mindsets--and maybe their skill sets--is a delicate process. Forcing that shift suddenly and publicly is a trifecta for disaster.


Additionally, many announcements were made on the grounds of increasing productivity, without regard for employee satisfaction. Company owners and executives are making their motivations perfectly clear--they want more money in their pockets, and they don't care about your wellbeing. Your growth. Your engagement levels. People are being set up for failure. But they're not being set up by AI; they're being set up by humans who are choosing to prioritize robots over people. We have to watch our language along with our actions.


I'd also like to point out that the executives themselves are still in their positions. They haven't been replaced by robots (yet). They see the value of humans in the leadership roles they hold. Clearly, humans hold value, and their decisions have a major impact on the future of the company and on society. We just need to see ourselves in those powerful positions now--to see ourselves as having the power to shape the future. We have the power to use tools wisely, to add unique value, and to create communities where humans matter.





Investor Chamath Palihapitiya called Klarna's move “a warning for the tech sector.” I'd like to elaborate with five on-the-nose warnings of my own.


From the employee base's perspective...


Executives who require their entire workforce to blindly use a tool, to be evaluated on it (in performance reviews), and to produce greater output in less time...


1) ...are neglecting to respect the uniqueness and variety of the roles at their company.


Some roles will not benefit from GenAI. Forcing a tool into a role for the sake of checking a box is not strategic; it's toxic leadership, and it hurts more than it helps. For example, I talked with AI authority Christopher Penn in the CMI Slack channel about a recent announcement by a CEO who planned to shift performance reviews to focus on required AI usage. We came up with a metaphor involving a blender: If you go into a restaurant's kitchen and tell everyone they must use a blender and that they'll be reviewed on how well they use it, the smoothie station may get 5 stars, but the steak chef will fail. Even if the steaks are top-grade, perfectly cooked, and delicious, not using the blender would yield a low score and maybe even get the chef fired. This is obviously preposterous. Now think about how the same logic applies to any tool.


2) ...are micromanaging.


It is more strategic, empowering, and impactful to focus on the vision and the outcome and to allow team members to employ their preferred methods of achieving it rather than to require a particular tool regardless of its purpose or goal. Do you tell your team members they must use Google or they're fired? No. If it suits them, great. If it doesn't, also fine, as long as they're doing their jobs well. You should instead be discussing strategy and vision and goals. You should be empowering your team members to learn and grow and discern. You should be considering quality, not just quantity. And you shouldn't be forcing someone to do something your way. Operations and guidelines are necessary; forced tool usage is not.


3) ...are prioritizing the wrong things.


If you want to use only robots instead of humans, or if you require humans to use AI to produce more output in less time, you don't care about your people, no matter how much you say you do. You're saying that money is more important than people. You're saying that you care more about lower cost and less time than you do about content quality and employee fulfillment. Plus, the data shows that 77% of workers say AI tools have decreased their productivity and increased their workload. Executives aren't empathizing with their team members. They're not considering the implications. Employees are starting to feel degraded by this emphasis on robots and artificial intelligence over human fulfillment and engagement. If you want to be successful, you need to engage your people. You need to care for your people. Prioritizing robots, money, and time is the opposite of caring for your people.


4) ...are communicating a hierarchy of value.


In addition to saying you care more about money than people, you're also saying you care more about AI than you do about other tools, processes, or skills. If one tool is required and is the baseline for your performance review but another tool isn't, the one that is required and backed by performance metrics will get more attention. By requiring AI and organizing resources toward it, you're emphasizing its importance relative to other things. As I always advise, be aware of your communication and the impact it has on others' expectations, assumptions, interpretations, and applications.


5) ...are being shortsighted.


Blind reliance on AI rather than enabling human cognitive processing hinders critical thought, decision-making, discernment, and innovation. These are uniquely human traits and skills. And we need to practice them so they don't atrophy. When you use AI to get one task done quicker today, you are cheating the people who are being made to use this shortcut. You are neglecting to consider the long-term implications of a heavy reliance on AI--decreased cognitive functioning, increased cynicism and social mistrust, less creativity and innovation, less astute decision-making and critical thinking, marred discernment and judgment. Your prioritization of ease is harming our ability to process information, grow in challenging moments, and think more strategically and creatively. Your egocentrism and shortsightedness could literally harm humanity. You do have the power to do that, yes, but you also have the power to prevent it. You have the power to embrace the analog, to nurture humanity, and to usher in a better future. This all sounds very dramatic, I know, but AI is not something to take lightly. AI is inviting a dramatic change to our lives. We need to recognize the drama and to be realistic about what's going on rather than riding the tidal wave blindly. We have the power. We just need to use our power in constructive ways.


We Need More Intentionality


We need to think beyond ourselves, beyond this task that we're facing right now. We need to pause and choose intentionality. We need to stop doing things automatically, without considering strategy or impact. We need to take breaks from the tidal waves and climb ashore to steady our feet and choose a better path forward. We need to gather our thoughts and consider context before making a decision. We need to reengage with each other, to connect socially and to empower personal and professional growth. We're not obsolete. We matter. And we need to recognize and harness our power as individuals and as a community. We're not stuck. And we're not helpless. We can shape the future into whatever we want and need it to be. Let's start now.



Tech Isn't About Tools. It's About People. And People Have More Power Than They Think.
