
Inclusion, Bias and Building Kindlee
In this episode of FinTech’s DEI Discussions, Nadia is joined by Carla Canino, Founder, CPO and CEO of Kindlee. Together they explore the critical issue of inclusion in financial technology, looking at how bias is embedded in artificial intelligence models and why tackling underrepresentation is both urgent and transformative for the industry.
Nadia opens by framing the purpose of the series: to celebrate the wins, raise awareness of the challenges, and walk the talk for change across financial technology. She is delighted to welcome Carla to the discussion, noting how long this conversation has been in the making. Carla shares her excitement and eagerness to talk openly about her work and her personal journey.
What follows is an in-depth conversation that uncovers how exclusion can damage people’s lives, how financial models often misrepresent significant parts of the population, and why inclusion is not a “problem” to be solved but an opportunity for innovation and growth.
The Creation of Kindlee and Its Mission
Carla begins by describing the origins of Kindlee. The company was born out of the need to make a meaningful impact within the financial sector. Kindlee’s central product is what she calls real-time fairness intelligence. This involves building tools that identify and reduce bias in financial artificial intelligence systems.
She explains that Kindlee has prioritised three major use cases. The first is bias reduction in customer-facing chat interactions, which are increasingly common across banks and FinTechs. The second, on the company’s roadmap, is bias reduction in KYC (Know Your Customer) systems, which determine whether individuals can pass onboarding and verification processes. The third is reducing bias in credit assessment systems, where decisions have life-changing impacts on customers.
Carla highlights that these use cases all fall within the high-risk classification of the AI Act, the European regulation that sets strict requirements for models affecting economic or financial opportunity. The Act requires third-party audits of such models, ensuring accountability. By focusing on these areas, Kindlee aims to bridge the gap between fairness and compliance, offering financial institutions solutions that represent all customers.
Carla is clear on the scale of the problem: more than 40% of the population is severely underrepresented by traditional models. Groups such as people with disabilities, older adults, and minority populations are excluded from design considerations, leading to biased outcomes. This is the gap Kindlee is determined to address.
Nadia underscores how damaging this exclusion is, pointing to the credit assessment example as one of the most urgent areas where fairness must be improved.
Carla’s Background in FinTech
Carla shares her career journey, noting that she has worked in the FinTech industry for around 15 years, primarily in the payments space. Her career has taken her through consultancy roles, positions on the acquirer side, work as a merchant, product management, and business development. This wide-ranging experience has given her what she calls a 360-degree view of decision-making in the financial sector.
She recalls being in the room when fraud prevention models and credit assessment systems were being built, while at the same time living with the experience of exclusion as a disabled person. She estimates that fewer than five percent of decision-makers in the industry have a disability, which has inspired her to pursue what she describes as inclusive product management alongside her professional career.
Carla has worked with organisations including the Federal Reserve and the W3C to educate stakeholders on how to design better products for disadvantaged groups. She argues that this not only makes systems more accurate but also creates better products overall.
The personal inspiration behind Kindlee, however, came from a very specific moment. When the pandemic began, Carla moved to the Netherlands and attempted to open an account with ING. At the time, she was using a wheelchair and was unable to physically authenticate herself in a branch. The KYC system had no mechanism to account for her needs, and she was completely blocked from opening the account.
Carla explains that if this could happen to her, with her industry knowledge, then for immigrants, non-native speakers, and others without such insight, the barriers must be even greater. That realisation led her to build Kindlee, starting with the problem of KYC mismatch.
She describes this issue as a form of mispricing of risk: by underrepresenting populations, AI systems create inaccuracies that not only harm customers but also reduce the revenue and opportunities available to financial institutions.
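To see why underrepresentation is a mispricing problem as well as a fairness one, consider a back-of-the-envelope sketch in Python. All figures here are invented for illustration; the point is simply that every creditworthy applicant wrongly declined is foregone revenue for the institution.

```python
# Hypothetical illustration of bias as mispriced risk: a model with a higher
# false-decline rate for an underrepresented group leaves revenue on the table.
applicants = {"well_represented": 90_000, "underrepresented": 10_000}
false_decline_rate = {"well_represented": 0.02, "underrepresented": 0.15}
avg_lifetime_value = 600  # revenue per correctly approved customer (invented)

lost = {group: n * false_decline_rate[group] * avg_lifetime_value
        for group, n in applicants.items()}
print(lost)  # {'well_represented': 1080000.0, 'underrepresented': 900000.0}
print(f"Total foregone revenue: {sum(lost.values()):,.0f}")  # 1,980,000
```

On these made-up numbers, a group making up only a tenth of applicants accounts for nearly half of the lost revenue, which is the commercial face of the harm Carla describes.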
What Inclusion Means
Nadia asks Carla one of her favourite questions: what does inclusion mean to you? Carla responds by emphasising that the answer is complex, because inclusion means different things to different people.
Her own approach is pragmatic. She argues that if products are designed with the needs of outliers in mind, then a wide variety of use cases are automatically served. She cites the example of the “curb-cut effect”: when pavement kerbs were cut into ramps so that wheelchair users could cross, the change also benefited parents with pushchairs, delivery drivers, and many others.
For Carla, this illustrates how designing for inclusion is also designing for innovation. Many of the technologies used today, such as speech-to-text tools and closed captions, were originally designed for people with disabilities. Today, they improve the lives of a much wider population.
She stresses that creating more accurate models and more inclusive products provides a business advantage. Companies benefit through reduced operational friction, fewer customer complaints, and improved customer satisfaction. Inclusion, she insists, is a virtuous cycle.
Nadia notes how important it is to stop seeing inclusion as a problem to be solved and instead recognise it as an opportunity for innovation and growth.
The Economic Evidence for Inclusion
Carla goes further, citing studies that demonstrate how inclusion drives economic transformation. Research has shown that financial inclusion fosters economic growth, lowers poverty, improves tax collection, and reduces inequality. McKinsey studies also highlight how inclusion fuels wider innovation.
She argues that society has been underestimating the economic power of inclusion as a design tool and as an innovation tool. By reframing inclusion in this way, businesses and regulators can better appreciate its value.
Systemic Bias in the Industry
When Nadia asks how the industry has been tackling systemic bias, Carla acknowledges the complexity of the challenge. She notes that some early adopters are doing a strong job of examining their data lifecycle and implementing responsible AI frameworks. She estimates that perhaps the top 15 percent of companies, often technology-forward neobanks, are making progress in designing for outliers.
However, many organisations lack visibility into whether their models are biased at all. Carla explains that bias is not necessarily a “dirty word”. In both statistics and AI design, bias is necessary to allow probabilistic systems to make decisions. The problem arises when bias leads to outcomes that disproportionately disadvantage certain populations.
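To make the distinction concrete, practitioners often quantify disproportionate outcomes with group fairness metrics such as the approval-rate gap or the disparate impact ratio. The short Python sketch below is purely illustrative, not Kindlee’s methodology; the decisions and the four-fifths threshold convention are assumptions for the example.

```python
from collections import defaultdict

def approval_rates(decisions):
    """Approval rate per group from (group, approved) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates, reference):
    """Each group's approval rate relative to a reference group.
    Ratios below ~0.8 are often treated as a red flag (the 'four-fifths rule')."""
    return {g: rate / rates[reference] for g, rate in rates.items()}

# Invented credit decisions: (demographic group, approved?)
decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]

rates = approval_rates(decisions)    # A ≈ 0.67, B ≈ 0.33
print(disparate_impact(rates, "A"))  # B's ratio of 0.5 falls below 0.8
```

A model can be statistically biased in the benign sense and still pass this kind of check; it is the persistent gap between groups, not the existence of bias itself, that signals a problem.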
She highlights how regulatory action this year has exposed major banks whose KYC systems over-flag certain groups. While she finds it unfortunate that it often takes penalties to spur change, she believes regulation such as the AI Act will ultimately create greater transparency, responsibility, and accountability.
Bias, she adds, does not always appear as sensational headlines. It can look like confusion, friction, silence, or hallucinations in AI systems. A chatbot that misunderstands an accent, a screening model that incorrectly flags a name, a decision that denies access without explanation: all are subtle forms of bias, and all are issues Kindlee is focused on addressing.
Measuring the Right Outcomes
Carla repeatedly returns to the theme that “we are what we measure.” If companies do not measure the outputs of their systems, they cannot properly evaluate the impact of inclusion. Too often, businesses measure effort rather than outcome, which does not move the conversation forward.
Kindlee is responding by creating the first benchmark study to assess bias and unfairness in chatbot models. Scheduled for launch in mid-October, this study will provide a transparent view of how chatbots perform, comparing different providers and evaluating how various populations are served.
Carla stresses the importance of focusing on chatbots because they are the first line of customer interaction. Many are used not only for customer service but also for onboarding flows and product suggestions. By benchmarking how inclusive they are, Kindlee aims to highlight how biases directly affect trust, customer satisfaction, and ultimately revenue.
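As a rough sketch of what such a benchmark could look like in practice, the Python below probes a chatbot with the same question phrased for different user profiles and compares mean response scores per group. Everything here (the stub chatbot, the grading function, and the prompt variants) is a hypothetical stand-in; Kindlee’s actual study design has not been described in that detail.

```python
from statistics import mean

def benchmark_chatbot(ask, prompts_by_group, score):
    """Send each group's prompts to a chatbot; compare mean response scores."""
    results = {group: mean(score(ask(p)) for p in prompts)
               for group, prompts in prompts_by_group.items()}
    gap = max(results.values()) - min(results.values())
    return results, gap  # a large gap suggests uneven service quality

# Stand-ins for illustration: a real study would wrap a live chatbot API in
# `ask` and use human raters or a grading model in `score`.
def ask(prompt):
    return "Please upload a photo of your ID in the app."  # dummy reply

def score(response):
    return 1.0 if "upload" in response else 0.0  # did the bot give a usable answer?

prompts_by_group = {
    "group_a": ["Hi, I'm Anna Smith. How do I verify my identity?"],
    "group_b": ["Hi, I'm Nguyen Thi Anh. How do I verify my identity?"],
}

results, gap = benchmark_chatbot(ask, prompts_by_group, score)
print(results, gap)  # identical dummy replies, so the gap here is 0.0
```

With a real model behind `ask`, a non-zero gap across otherwise identical prompts is exactly the kind of subtle, measurable bias described above.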
Empathy as the Core of Leadership and Design
Nadia closes by asking what more can be done to drive inclusion in workplaces. Carla responds by emphasising two key priorities.
First, it is essential to have the right people in the room. Without diverse perspectives, blind spots will remain unchallenged. Carla herself has been able to highlight how disabled people are disadvantaged in product design, a perspective that would otherwise have been missing.
Second, businesses must return to the importance of measuring outputs and impacts. Without metrics that show how inclusive design improves customer experience, inclusion risks being undervalued.
Carla explains that Kindlee is focused on making the cost of bias visible. Bias in systems creates friction, undermines trust, and deters customers. By showing these impacts transparently, businesses will be forced to act.
She ends with a reflection on empathy as one of the highest forms of intelligence. For her, product management is about transforming pain into scalable solutions. That process requires empathy: understanding and validating the pain points of diverse users. When teams are built with empathy at their core, they create better products and better outcomes.
Nadia agrees wholeheartedly, praising the way Carla has used her own experiences to build solutions and to demonstrate why inclusion matters at every level of financial technology.
Conclusion
This episode of FinTech’s DEI Discussions is a powerful reminder that inclusion is not just a social obligation but a business imperative. Carla Canino, through her leadership at Kindlee, shows how reducing bias in AI systems is essential for building fairer, more accurate, and more innovative financial models.
The conversation touches on exclusion in KYC processes, the need to design for outliers, the economic evidence for inclusion, the role of regulation, the importance of measuring outputs, and the central role of empathy in leadership. For FinTech, these insights demonstrate how inclusion drives innovation, customer trust, and long-term growth.
As Nadia emphasises in closing, inclusion must be reframed as an opportunity. With leaders like Carla building companies dedicated to fairness, and with financial institutions increasingly held accountable for bias, the industry has the chance to create a more equitable future.
At Harrington Starr, we understand that building fair and future-ready financial systems requires more than innovative technology; it requires the right people driving that innovation. This episode highlights how critical inclusion and bias reduction are within FinTech. As a global FinTech recruitment firm, we work with clients across London, New York, and beyond to connect them with exceptional talent that has both the technical expertise to shape advanced financial models and the vision to create inclusive, ethical solutions. By championing diverse hiring and supporting organisations in finding the best professionals in data, AI, product, and financial technology, Harrington Starr helps businesses thrive while fostering fairness, representation, and sustainable growth in the sector.