ReportWire

Tag: racial bias

  • Florida woman tries to get notary with Chase Bank. Then all the tellers laugh her out of the bank: ‘They showing up on the reviews’


    A Florida woman went to a Chase Bank location near her to have some documents notarized. But instead of receiving professional service, she recounted an experience in which employees allegedly laughed at her and refused to assist her.

    “I know I’m 31 years old and I should not be crying about this because somebody talked about me, but it’s so frustrating,” said Samiya (@nextchaptersamiya) in a video with over 950,000 views on TikTok.

    Samiya said employees at the bank stared at her continuously as she tried to receive service, and when she became so overwhelmed by their behavior that she had to leave the bank, none of them acknowledged it.

    The entire experience left her wanting to switch to a different bank entirely. So why did the employees think her height was “so funny”? And are uncomfortable encounters with Chase’s employees a common occurrence?

    Samiya’s experience with rude bank tellers at Chase

    When Samiya arrived at the location near her, she immediately noticed Chase’s bank tellers laughing at what she assumed was her height.

    “I walk in and they all just start chuckling,” Samiya said. “It’s just me alone. They all just start chuckling. When I get past a certain spot, you can see the last person that’s at the window spot, and she looks up and she’s like, ‘Oh, wow.’” Samiya stands at around 6 ft 1 or 6 ft 2, which she believed the tellers noticed.

    Despite this, she continued asking for a notary. An employee immediately told her that they “could not help.”  

    “So I’m like, ‘Hey, I’m here to get a notary.’ And the Black lady, she’s like, ‘Oh, I can’t do it. I’m doing this right now,’” Samiya recounted. Instead, a younger man told her from afar that he could help her. She walked over and received assistance from him, but as she did so, other tellers continued to look at her and laugh.

    Samiya continued having trouble, as she did not have her bank card with her and struggled to provide the phone number linked to her account. The tellers continued laughing, leaving her overwhelmed and disoriented.

    Samiya ended up leaving without getting her documents notarized. She tried to file a complaint while there, but she felt too disoriented by the employees’ behavior and left empty-handed.

    Later, she drove away and found a spot to cry, describing the situation as unkind. “I’m here to get a notary,” Samiya said. “Do your job, give me a notary. You don’t have to be laughing back and forth to your friends. I know I’m tall. Whatever, whoop-dee-freaking-do, but do your job.”

    Commenters share their concerns with the encounter 

    Many people echoed that the exchange was rude and completely unprofessional. “People nowadays are becoming more and more rude. This is so sad. This world is becoming so inhumane,” said one viewer. 

    Overall, many people told Samiya that she wasn’t being “sensitive” or “emotional” about her experience with Chase’s bank tellers. She was having a very real human response to behavior that should have been far more professional.

    In her comments section, Samiya said, “Ultimately, there’s a time and place for everything. When a customer walks up, professionalism should always come first and people should feel comfortable and respected. Thank you so much to everyone who showed support, but there’s no need to leave any more bad reviews or call the company. I reached out, and they’ll be reaching out to the branch manager to speak with those involved, so we’re good. I hope everyone has a great holiday, enjoys time with family and friends, and God bless.” 

    An update from Samiya

    According to Samiya, a wave of support came in after thousands saw her TikTok. Many commenters chose to leave a Google review for the Chase branch she visited, expressing dissatisfaction with the way she was treated.

    She later posted these Google Reviews to her page and stated she was happy that many people supported her, as the interaction was unnecessary and hurtful.

    “I’ve emailed the company directly to address what happened, and we’ll see how they choose to respond,” Samiya said. “I truly appreciate everyone who took the time to support me and leave a review. Thankfully, I was able to get the notary done elsewhere at no cost, and my day ended on a much better note. Thank you all for the kindness.”

    The bank has deleted many of these Google Reviews and not made any definitive statements regarding the encounter. They have, however, processed Samiya’s complaint and let her know that the situation was being “handled.” 

    In the video’s caption, Samiya wrote: “I’m not usually this sensitive or emotional. But I’m in a very tender season of my life, and being laughed at when I needed professionalism and empathy pushed me over the edge. What happened at Chase Bank today was unnecessary and hurtful. You never know what someone is carrying. Kindness goes a long way. Do better!”

    We’ve reached out to Chase Bank via its corporate email and to Samiya via TikTok direct message for comment.

    Have a tip we should know? [email protected]


    Rachel Thomas

    Rachel Joy Thomas is a music journalist, freelance writer, and hopeful author who resides in Los Angeles, CA. You can email her at [email protected].


  • Marcellus Williams’ Death: Political Execution of a Black Man Carried Out by the Supreme Court



    In a damning display of justice gone wrong, Marcellus Williams, a Missouri death row inmate, was executed, despite overwhelming evidence suggesting his innocence. His death by lethal injection has sparked outrage, with the blame falling squarely on the shoulders of former President Donald Trump, Senate Minority Leader Mitch McConnell, Missouri Governor Mike Parson, and the conservative U.S. Supreme Court justices who refused to halt the execution.

    Williams, 55, was convicted in 2001 for the 1998 murder of Felicia Gayle in her St. Louis apartment. However, no DNA evidence ever tied him to the crime. The St. Louis County Prosecuting Attorney’s Office, which urged a stay of execution, had supported his legal team in its tenacious fight for clemency. The victim’s own family had requested Williams’ sentence be commuted to life without parole, writing, “Marcellus’ execution is not necessary.”

    Yet, the conservative majority on the Supreme Court—Chief Justice John Roberts, Neil Gorsuch, Clarence Thomas, Samuel Alito, Brett Kavanaugh, and Amy Coney Barrett—voted to deny Williams a stay. Their decision condemned an innocent man to death, and it is a stark reminder of how deeply broken the justice system has become under their influence. Liberal justices Sonia Sotomayor, Elena Kagan, and Ketanji Brown Jackson dissented, recognizing the glaring miscarriage of justice.

    This execution didn’t happen in a vacuum. It is a direct result of the political power play that Trump and McConnell orchestrated. Trump’s appointment of three ultra-conservative justices—Gorsuch, Kavanaugh, and Barrett—solidified a Supreme Court more interested in ideology than fairness. McConnell’s refusal to consider Barack Obama’s 2016 nominee, Merrick Garland, to replace Justice Antonin Scalia was a pivotal move in ensuring this conservative stronghold. He later rushed through Amy Coney Barrett’s confirmation weeks before Trump’s election loss, fully aware of the long-term consequences.

    Gov. Mike Parson, a staunch MAGA Republican, ignored every plea for mercy, including those from the prosecutor’s office and over a million citizens and faith leaders who called for clemency. Despite abundant evidence of Williams’ innocence, Parson’s decision to carry out the execution was viewed by many as cruel and motivated by bloodlust.

    “This was a lynching. Make no mistake, this was state-sanctioned murder of an innocent Black man,” NAACP President Derrick Johnson declared. “Governor Parson had the responsibility to save a life, and he didn’t. When DNA evidence exonerates a man, capital punishment is not justice—it is murder. Trump, McConnell, and the conservative Supreme Court justices now have blood on their hands.”

    Johnson added that Williams’ final moments were a tragic reminder of the human cost of this injustice. Reportedly, Williams lay conversing with a spiritual advisor as the lethal injection took effect. His chest heaved a few times before he went still, as his son and two attorneys watched helplessly from another room. No one from Gayle’s family was present to witness the execution—likely because they had asked for his life to be spared.

    Cori Bush, Missouri’s Democratic Representative and staunch opponent of the death penalty, minced no words in condemning Parson’s role. “Governor Parson didn’t just end Marcellus Williams’ life—he demonstrated how the death penalty is wielded without any regard for innocence, compassion, equity, or humanity,” Bush stated. “He ignored the facts, the evidence, and the pleas from all sides. The so-called ‘beyond a reasonable doubt’ standard was tossed out, because Marcellus was a Black man in a system rigged against him.”

    Many also said the hypocrisy of the so-called “pro-life” conservatives was laid bare. Charlotte Clymer, a U.S. Army veteran and activist, blasted the justices responsible, saying, “These people don’t care about life. They only care about control.”

    Williams’ case, much like so many others involving Black men and the death penalty, exposed the deep racial bias embedded in America’s legal system. His attorneys had raised significant concerns about racial discrimination during jury selection, and the lack of credible evidence—especially DNA that didn’t match Williams—only underscored the injustice of his conviction. Yet, the political machinery of Trump, McConnell, Parson, and the Supreme Court moved forward without pause, ensuring his death.

    As Bush and others stated, Williams’s death wasn’t just an issue of a broken justice system — this was a political execution. Like Parson, the U.S. Supreme Court chose to ignore the evidence, the pleas, and the humanity of Williams. Social media users posted comment after comment declaring that Williams’ blood is on Republicans’ hands and that the country must reckon with the brutal truth that our highest court, and the leaders who enable it, can no longer be trusted to protect the innocent.

    Williams’ execution, despite overwhelming evidence of his innocence, is a searing indictment of a broken system where political power and racial bias outweigh truth and justice, Bush noted. “This was not just an execution,” she railed. “This was a state-sponsored lynching, and every person responsible for it must be held accountable.”


    Stacy M. Brown, NNPA Newswire Senior National Correspondent


  • Banks must address bias in large language models



    Sometimes the bias can be easy to identify and easily fixed. For example, the large training text might include toxic or hateful dialogue, in which case that text is identified and removed, write Zor Gorelov and Pablo Duboue, of Kasisto.


    In the rapidly evolving landscape of artificial intelligence for banking, the past 18 months have produced a fascinating evolution in the technology, the players and overall industry perception. 

    Even with its ambitious vision to transform the banking industry and its noteworthy early successes, generative AI has one well-known drawback: implicit bias, which poses a risk if unaddressed. For example, on Feb. 26, Google’s parent company, Alphabet, saw its market capitalization drop by the equivalent of Disney’s total net worth after its Gemini product was widely criticized for its issues with bias. 

    Is AI bias worth addressing? Is it worth addressing in banking? Absolutely. But what exactly is the problem and how does it get fixed? 

    Let’s begin by discussing the expectations of relevancy and freshness of the training data, particularly in the context of written content. By its very nature, once a word has been laid down to paper (or to electronic format) it is already an expression of the past. 

    Even if it was only written a week ago, it is now week-old news. This fundamental principle of relevancy and freshness in human communication particularly affects large language models, the brains behind generative AI. The training data that LLMs require combines large amounts of internet text from various time periods.

    This text reflects different societal positions on various topics and is written in the language of those times. Saying the LLM exhibits “bias” is a way of simplifying the problem: all cultures have explicit and implicit biases, and we notice the text is inappropriate when its bias is out of step with our current societal perceptions. In that sense, LLMs are by definition trained on outdated information.

    Sometimes the bias can be easy to identify and easily fixed. For example, the large training text might include toxic or hateful dialogue, in which case that text is identified and removed. 
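As a concrete illustration of that identify-and-remove step, here is a minimal sketch of corpus filtering, assuming a simple blocklist approach. The names (`BLOCKLIST`, `is_toxic`, `filter_corpus`) and the placeholder terms are hypothetical; production pipelines typically use trained toxicity classifiers rather than word lists.

```python
# Minimal sketch of pre-training corpus filtering via a blocklist.
# The blocklist terms are hypothetical placeholders.
BLOCKLIST = {"slur1", "slur2"}

def is_toxic(document: str) -> bool:
    """Flag a document if it contains any blocklisted term."""
    words = set(document.lower().split())
    return bool(words & BLOCKLIST)

def filter_corpus(corpus: list[str]) -> list[str]:
    """Drop flagged documents before the text reaches training."""
    return [doc for doc in corpus if not is_toxic(doc)]
```

A classifier-based filter would replace `is_toxic` with a model that scores each document, but the pipeline shape stays the same: flag, then drop before training.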

    For wide adoption of LLMs in banking, removing these biases is not only needed but also legally required. Producing customer communications with a gender or racial bias will clearly find pushback from customers and regulators. Most of the training data employed in LLMs is from the 1990s and 2000s when the culture of sharing text freely on the Internet was commonplace. Nowadays, more content is in images and video or behind paywalls. 

    Fast forward to 2024, our current society has significantly changed its views in many of these areas. Thus, at the very least, a tight human and regulatory oversight for these types of sensitivities is recommended. 

    Furthermore, cultural bias can be difficult to perceive for individuals immersed in a given culture. It is part of the “operating system” of the society. There are a number of recent technical advances that enable adjusting the LLM bias to conform to current times. It all starts by identifying the existing biases in the system and then using humans to indicate which variations in a text are to be preferred. This is the method used by OpenAI’s ChatGPT as well as other leading LLMs to add guardrails to overcome some of the existing bias. This process is very expensive in terms of both personnel and computer time. 
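The human-preference step described above can be sketched as follows. This is an illustrative simplification of how RLHF-style pipelines collect judgments, not OpenAI’s actual implementation; `PreferencePair` and `record_judgment` are hypothetical names.

```python
from dataclasses import dataclass

@dataclass
class PreferencePair:
    """One human judgment: which of two model outputs was preferred."""
    prompt: str
    preferred: str   # the variation the human rater chose
    rejected: str    # the variation the rater passed over

def record_judgment(prompt: str, response_a: str, response_b: str,
                    rater_picked_a: bool) -> PreferencePair:
    """Turn a single rater decision into a training example.

    Large collections of these pairs are what the guardrail process
    trains on to steer the model toward preferred outputs.
    """
    if rater_picked_a:
        return PreferencePair(prompt, response_a, response_b)
    return PreferencePair(prompt, response_b, response_a)
```

The expense the authors mention comes from scale: every pair requires paid human time, and the subsequent fine-tuning consumes significant compute.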

    In the world of banking, this process needs to be enhanced to prevent LLMs from being used for blatantly illegal activities, such as impersonating someone to obtain a loan. Implementing guardrails is an approximation, and the process should be carefully managed because it is prone to overcorrection. The issue that contributed to Alphabet’s value loss mentioned above involved its new product, Gemini, overcorrecting to the point of generating historically inaccurate imagery of the U.S. Founding Fathers.

    Addressing implicit bias must start at the source. There is a growing understanding in the world of generative AI that the companies that train and build their LLMs on high-quality human-curated data and text, rather than large amounts of random data and text, will provide the most value to their customers. 

    In financial services, it is imperative to partner with vendors that use high-quality, banking-specific data sources to help mitigate the risk of implicit bias in the AI systems being developed. 

    Addressing biases necessitates a shift toward custom LLMs that are tailored for industry-specific needs. An LLM that is built for banking offers the same experiences and features as the larger, general-purpose LLMs while also meeting the banking industry’s requirements for accuracy, transparency, trust and customization.

    These models are not only more cost-effective to create and operate, but they also provide better performance compared to general-purpose LLMs. Moreover, as generative AI has evolved toward multimodal capabilities, integrating text, image and other data modalities, banks will be able to leverage this capability to analyze diverse types of information and deliver more comprehensive insights.


    Zor Gorelov


  • The racial wealth gap is getting wider. Can technology fix it?



    Four years ago, George Floyd was choked to death by a police officer after trying to use a possibly counterfeit $20 bill at a Minneapolis convenience store. Widespread outrage about the killing spurred the largest U.S. banks to vow to do their part to fix the inequalities in the American financial system. 

    JPMorgan Chase announced it would spend $30 billion to address social and economic inequities. Bank of America and Citi each pledged $1 billion. Wells Fargo promised $450 million, U.S. Bank $116 million.

    Today, the banks say they’ve put this money to good use.

    JPMorgan Chase says it’s invested $30.7 billion in racial equity initiatives, mostly in the preservation and construction of affordable housing. Citi says it has provided growth capital and technical assistance to minority depository institutions, invested in Black-owned businesses and affordable housing, and is working to become an antiracist institution.

    Wells Fargo has committed $150 million to a special purpose credit program. Bank of America says it’s committed $1.2 billion to advance economic opportunity, focusing on jobs, affordable housing, small businesses and health equity. U.S. Bank says it has stepped up lending to minority owned small businesses and mortgage down payment assistance in underserved communities. 

    Despite the tens of billions of dollars banks have spent, the racial wealth gap has actually widened over this time period.

    According to the Federal Reserve Board’s most recent report on racial inequality, median wealth among white families was $285,000 in 2022, compared with $44,900 for Black families. That’s a difference of about $241,000. In 2019, the difference was roughly $191,000. For Hispanic families, the median wealth totaled $61,600 in 2022. That means the wealth gap between Hispanic and white families totaled $224,000, up from roughly $177,000 just three years earlier. 
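As a quick check, the gap figures follow directly from the medians quoted above; the article’s slightly different rounding presumably comes from the unrounded medians in the Fed report.

```python
# Median family wealth from the Fed report as quoted above (2022, rounded).
white_2022 = 285_000
black_2022 = 44_900
hispanic_2022 = 61_600

black_gap = white_2022 - black_2022        # 240,100 ("about $241,000" above)
hispanic_gap = white_2022 - hispanic_2022  # 223,400 ("$224,000" above)
```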

    And while 72.7% of white Americans own their own home, only 44% of Black Americans do, according to the National Association of Realtors. Among Hispanic families, the home ownership rate is 50.6%; among Asian families, it’s 62.8%. Black people account for only 4.3% of the 22.2 million business owners in the U.S. 

    “The reality is, white America and people of color America are living in two different financial realities,” said Silvio Tavares, CEO of VantageScore. “And as Americans, we know that that’s not sustainable. Putting aside the moral aspects of it, just as a business proposition, that’s just not sustainable.”


    Impact of the racial wealth gap

    Aaron Long grew up in the 1980s in St. Louis.

    “In the inner cities, you had the drugs, the crack, all of that stuff,” said Long, who is head of client advisory and strategy at Zest AI, a technology company with an AI-based lending platform. “Wealth affects two important things on the household level. It affects education and the environment that you’re in. Without being able to improve those, you have this continuous cycle.”

    People will sometimes blithely say that kids born in disadvantaged neighborhoods just have to pull themselves up by their bootstraps, work hard and overcome their circumstances. But Long says this cliche is not a realistic prescription to improve the lives of children growing up in poverty.

    “It’s super tough to get out,” he said. “You don’t have the skills to do it. You don’t have the education to do it. You don’t know where to go to do it.”

    Kids who grow up in poor inner cities have “small dreams,” Long said, “because that’s the only thing that you know how to dream about — you don’t see anyone in your family that you can pick up the phone and say, ‘How do I start a business?’”

    And it’s been this way in the United States for decades. In the mid-1960s, the average Black household was making around 57 cents per dollar compared with the average white household, according to Long. Today it’s around 62 cents.

    “You can see over the generations that the wealth gap is still there,” Long said. “If we continue with that trajectory, it’ll be well over 500 years before we’re able to have no wealth gap at all.”
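Long’s trajectory claim can be sanity-checked with a back-of-the-envelope linear extrapolation. The start and end years below are assumptions (his exact figures and method aren’t given), so treat this as an order-of-magnitude check, not a reproduction of his math.

```python
# Black household income per dollar of white household income,
# using the figures Long cites; the exact years are assumed.
start_year, start_ratio = 1965, 0.57   # mid-1960s: ~57 cents
now_year, now_ratio = 2024, 0.62       # today: ~62 cents

# Gain per year if the trend were perfectly linear.
rate = (now_ratio - start_ratio) / (now_year - start_year)

# Years until the ratio reaches parity (1.00) at that pace:
# roughly 450 under these assumptions, the same order of magnitude
# as Long's "well over 500 years."
years_to_parity = (1.00 - now_ratio) / rate
```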

    Racism and systemic issues still prevent African Americans from getting approved for credit, said Tonita Webb, CEO of Verity Credit Union in Seattle. 

    “It is so traumatizing for some to even just walk into a bank to apply, because of their past experience,” she said. “I know people who won’t do it because they think the financial services industry is not for them because of all the nos that they have received.”

    Some of those nos may have been for sound creditworthiness reasons, she said, but banks frequently also don’t take any steps to help move these applicants forward. Others are rejected “just because that’s been the history of our financial services industry,” Webb said.

    A long history

    Wole Coaxum left his job at JPMorgan Chase and started a fintech called Mocafi after Michael Brown, an 18-year-old Black man, was shot and killed by a police officer in Ferguson, Missouri, in 2014. A grand jury subsequently declined to indict the officer, and a firestorm of protests followed. Mocafi works with governments and nonprofits to provide financial services to underserved consumers. 

    “Watching the folks in Ferguson in the streets protesting, for me, was an instance of people fighting for social justice, but also a need for economic justice and a lack of access to opportunity,” said Coaxum. Their lack of resources was part of the reason they were in the streets, he thought. 

    In Coaxum’s view, the racial wealth gap “is deeply rooted in the bones of this country, and I’m reminded of it regularly.” 

    For instance, President Franklin Delano Roosevelt’s G.I. Bill was designed to help World War II veterans obtain affordable mortgages guaranteed by the Veterans Administration. But the loans were made by white-run financial institutions that rarely provided mortgages to Black people.

    As a result, the vast majority of the benefits went to white service members. In one example, “fewer than 100 of the 67,000 mortgages insured by the GI Bill supported home purchases by non-whites” in the New York and northern New Jersey suburbs, historian Ira Katznelson wrote in the book “When Affirmative Action Was White: An Untold History of Racial Inequality in Twentieth-Century America.”

    “The biggest economic driver of the 20th century that enabled us to become a superpower post World War II excluded Black people,” Coaxum said. “From a historical lens to a modern lens, there is a consistent thread of Black folks having less access to wealth building opportunities.”

    What it would take to shrink the racial wealth gap

    The racial wealth gap is a huge, multifaceted problem with experts disagreeing over how to best close it. Some consider increased home ownership the answer, because of all the socioeconomic benefits that stem from that. Others focus on improvements in wages, basic income, increased savings or short-term loans that people can turn to in a pinch and, say, get new tires for their car so they can keep going to work. Others still think artificial intelligence will help. Many believe it will take a concerted effort by the banking industry, fintechs and government.

    “It is a question I grapple with all the time,” Webb said. “And here’s where I land. We can make a difference for our small community and our small membership. But I think to make a difference for the overall wealth gap, the financial services industry has to make a decision to provide programs to undo systemic practices and policies and use technology, such as AI, that looks at other things besides the credit score, which we know is systemically created to have an advantage for some and a disadvantage for others.”

    Financial services firms could provide education to help people understand the financial system and how to navigate it, she said. And products need to be developed for the purpose of shrinking the wealth gap. 

    If more than 70% of white people own homes and only 40-plus percent of Black people do, “there has to be something specifically done to close that gap,” Webb said.

    It’s not enough for the government to put out a policy that companies can no longer discriminate, Webb said. There are already laws, including the Fair Housing Act of 1968 and the Equal Credit Opportunity Act, that prohibit lending discrimination based on race — and yet these issues persist. 

    “We’ve had decades and years of discrimination,” she said. “We also have to create programs that give access where folks didn’t have access before in order to shrink that gap. We’ve got to remember there are underserved communities that are way behind, so they’re playing the catch-up game.”

    Coaxum sees the racial wealth gap as a market failure that would be best solved in partnership with the government. Banks are driven to target more affluent — and in general, white — customers. These consumers tend to have more assets that the banks hope to help them invest. Originating one larger mortgage for a more expensive home is seen as less of a hassle than making several smaller loans for more modest houses. Credit decisions tend to be easier, and lenders feel more assured they will be paid back. 

    “If left to the private sector, it’s going to come along in a drive towards efficiency that doesn’t necessarily have a wide net that is systematic, sustainable and strong enough to close the wealth gap in our communities,” Coaxum said.

    Until local, state or federal government does something, “we’re just going to have a series of really smart people building really interesting companies, but may not have the scale that’s required to really meaningfully shift the needle,” Coaxum said. 

    One thing governments could do is rethink how they get resources to the unbanked and underbanked of their communities and work with partners to do this digitally, rather than through checks and benefits cards, Coaxum said.

    Coaxum’s fintech, Mocafi, for instance, works with New York City to provide immigrants with debit cards they can use to receive help. 

    New migrants to New York are processed at the Roosevelt Hotel in Manhattan. They used to receive food deliveries every three days but this inevitably meant that uneaten food was thrown out, making the effort expensive and wasteful. With Mocafi, the city is testing giving immigrants a preloaded debit card so that they can buy their own food. According to Coaxum, this new system is a third of the cost of having food delivered and gives participants more choice in what they eat. It also puts dollars into the community and reduces waste, he said. 

    The credit gap

    Tavares’ family came to the United States from Angola when he was 10 years old. His mother was a physician and his father was a politician turned professor. His parents found a house they liked in a safe neighborhood with good public schools. His father went to the local savings bank to apply for a mortgage.  

    “He fully anticipated that he would be approved because he had a Ph.D.,” Tavares, VantageScore’s CEO, recalled. “He was a professor at a prestigious university, and he had money in the bank.”

    The application came back a couple weeks later: Denied. When his father walked into the bank branch to ask why, he was told it was because he was an immigrant and didn’t have a credit report. Tavares’ parents talked about this a lot at the kitchen table.

    “I was just starting to learn English, but I kept on hearing this weird word, ‘mortgage,’” Tavares said.

    It’s degrading and discouraging to be declined for credit the way his family was, Tavares said.

    “When you say to somebody, you are not creditworthy, what they often focus on is not the credit part, but the banker saying, ‘You are not worthy,’” he said. 

    That stigma is part of the reason why African Americans and Hispanics often are suspicious of the banking system, “because they have a relative or somebody that they know who was very hardworking, very focused on savings, but then when they applied, they got denied,” Tavares said.

    In Tavares’ case, his father decided to use the family’s entire savings to buy the house, against his mother’s objections that if any one of them got sick, the family would be ruined. His father said the family would build a credit report over three or four years, refinance and get the money back.

    “They were able to do that, and that’s what paid for my engineering degree, my MBA and my law degree,” Tavares said.

    Starting in the fourth quarter, the Federal Housing Finance Agency will require lenders to use VantageScore 4.0 scoring models in order to sell mortgages to Fannie Mae and Freddie Mac. VantageScore 4.0 uses machine learning and trended credit data to assess the creditworthiness of people who have limited credit history. Trended data shows a person’s pattern of financial behavior over a set period of time, generally about 24 months. Tavares estimates that this will enable 4.9 million new borrowers to become eligible for a mortgage and 2.7 million will be able to easily get a new mortgage because their credit score will be above 620. 
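The idea behind trended data can be illustrated with a small sketch: instead of a single balance snapshot, the score’s inputs include the direction of the last 24 months. This is a simplification to show the concept, not VantageScore’s actual 4.0 model; `balance_trend` is a hypothetical helper.

```python
def balance_trend(monthly_balances: list[float]) -> float:
    """Least-squares slope of balances over time (negative = paying down)."""
    n = len(monthly_balances)
    mean_t = (n - 1) / 2
    mean_b = sum(monthly_balances) / n
    num = sum((t - mean_t) * (b - mean_b)
              for t, b in enumerate(monthly_balances))
    den = sum((t - mean_t) ** 2 for t in range(n))
    return num / den
```

Two applicants with identical balances today can look very different through this lens: one trending down (steadily paying off debt) and one trending up (steadily accumulating it).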

    Everyone who is creditworthy should have access to a mortgage, which is the key to unlocking financial stability, Tavares said.

    Demonstrators hold up images of George Floyd during a protest in 2021. Floyd was choked to death by a police officer after trying to use a possibly counterfeit $20 bill. His death spurred large U.S. banks to pledge funds to help fix the inequities in the U.S. financial system.

    “If you own a home, all sorts of great things flow from that: better access to public schools, a financial security cushion when times get rough, because you can dip into your home equity,” Tavares said. “Eventually when kids finish public high school, they can go on to college and you can tap your home equity to finance that.”

    Besides mortgages, access to other types of credit, such as an auto loan, can make a significant difference in closing the racial wealth gap, experts said.  

    “Being able to access a car directly translates into better opportunities to tap new work opportunities,” Tavares said. “It gives you the ability to find the best job in your area, the one that pays the highest wages, and that translates directly into increased wealth and closing that racial wealth gap.”

    Solo Funds, a Los Angeles fintech that hosts a platform on which people in disadvantaged communities make small loans to one another, is closing the racial wealth gap for its members, according to co-founder Rodney Williams. 

    Solo Funds’ borrowers have saved nearly $30 million in fees they would have paid had they used a credit card, Williams said. And people who lend on the platform are seeing their money grow for the first time in their lives, he said.

    Solo doesn’t have the budget to do much marketing, he said. 

    “But if you go into the inner city community, if you go to the barber shop and you have a flat tire, someone’s going to say, use Solo,” Williams said. “That’s just the word on the street.”

    The need for alternative data

    Some blame the banking industry’s reliance on the FICO score and traditional credit history data for the persistence of the racial wealth gap.

    “There’s not enough data in the traditional credit bureau system to give lenders confidence about how to lend to segments that are not well represented in the credit bureau file,” said Misha Esipov, founder and CEO of Nova Credit. “To better serve those segments, you need to have a platform which includes the infrastructure, the analytics and the compliance to better understand those segments.” 

    Nova Credit’s platform provides credit bureau data (including from other countries), bank account transaction data and rent payment history as well as analytics and income verification.

    “Our belief is that when you have more data and more visibility, you can responsibly serve these segments that the traditional credit bureau model just doesn’t quite capture,” Esipov said. 

    One in five Americans has no credit score because they don’t have enough credit history to be scored, said Brian Hughes, former chief risk officer at Discover.

    Yet 95% of American adults have a checking account, “which is a great source of data and payroll data,” Hughes said. “There’s light that can be brought to these customers that don’t have a credit score. And once it’s brought, then adoption can happen and if adoption happens, greater inclusion happens,” he said. 

    Webb at Verity Credit Union agrees the FICO score is not sufficient to determine creditworthiness. FICO scores are calculated using data in credit reports that is grouped into five categories: payment history, amounts owed, length of credit history, new credit and credit mix. (FICO also offers UltraFICO, a model through which consumers opt to have a bank incorporate an analysis of their bank account data into their score. VantageScore offers a similar product, VantageScore 4plus.)  

    “A FICO score really only looks at five or six different pieces of data,” Webb said. “There’s lots of other ways that we can get more information about somebody’s character. Someone shouldn’t have to pay for the rest of their lives for maybe a blip in their lives.”
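    The weighting of those five categories is published by FICO in rough terms: payment history about 35%, amounts owed about 30%, length of credit history about 15%, and new credit and credit mix about 10% each. The toy model below uses those published weights, but the 0-to-1 subscores and the mapping onto the 300-850 range are invented for illustration; the actual formula is proprietary. It makes Webb's point concrete: because payment history alone carries roughly a third of the weight, one past "blip" can move the score a long way.

```python
# Toy model only: the weights below are FICO's published approximate
# category weights, but the subscores and the 300-850 scaling are invented.

FICO_WEIGHTS = {
    "payment_history": 0.35,
    "amounts_owed": 0.30,
    "length_of_history": 0.15,
    "new_credit": 0.10,
    "credit_mix": 0.10,
}

def toy_score(subscores):
    """Blend five 0-1 subscores into the 300-850 range (illustrative)."""
    blended = sum(FICO_WEIGHTS[k] * subscores[k] for k in FICO_WEIGHTS)
    return round(300 + blended * 550)

borrower = {"payment_history": 1.0, "amounts_owed": 0.8,
            "length_of_history": 0.6, "new_credit": 0.9,
            "credit_mix": 0.7}
clean = toy_score(borrower)   # 762

# The same borrower after a payment-history "blip": only one of the
# five inputs changes, but it alone carries 35% of the weight.
borrower["payment_history"] = 0.5
blip = toy_score(borrower)    # 666
```

    A roughly 100-point swing from a single category is why critics argue the model leaves little room for context, such as a temporary illness.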

    For instance, a consumer could get a cancer diagnosis that impacts their ability to work for a time, she said. 

    “That is life and that is part of credit,” Webb said. “You can’t make somebody pay for this for 10 years. The situation can improve and no longer be a mitigating factor to how they’re going to pay their bills moving forward.”

    Banks’ and credit unions’ efforts to use alternative data, such as checking account data, to inform lending decisions is a step in the right direction, in Coaxum’s view. 

    “But you can’t forget that check cashers and pawn shops and payday lenders are serving this customer, and those data elements are not in the algorithms,” he pointed out.

    If algorithms had data from these sources, banks would have “a pretty good shot at maybe reimagining lending for this population,” Coaxum said. “That dataset would allow you to come up with some more interesting and creative lending solutions that you could feed the algorithms that might open the market up.”

    While check cashers and pawn shops don’t report loan repayments to credit bureaus, they do sometimes report when people fail to repay, creating a double negative for people who don’t have access to bank branches. The same is typically true for rent: the landlords that do report to credit bureaus tend to report only missed payments, not on-time ones.

    Some see hope in a movement to get landlords to report tenants’ rent payment to the credit bureaus. This could give people who can’t afford to purchase a home a way to build a credit history and work toward possibly obtaining a mortgage. 

    Esusu, for example, facilitates the reporting of on-time rent payments to credit agencies. It partners with government-sponsored housing enterprises like Fannie Mae and Freddie Mac.

    The company says it has unlocked billions of dollars in credit and facilitated access to loans, mortgages and student loans for individuals who were previously underserved.

    “The tangible increase in credit scores among renters and the creation of new credit tradelines demonstrate progress in bridging the racial wealth gap by providing financial opportunities to those who were previously credit invisible,” said Samir Goel, co-founder and co-CEO of Esusu. 

    AI-based lending

    Some bank and fintech leaders think AI could help close the racial wealth gap. 

    “We are in the early stages of assessing the transformative power of AI,” said Carolina Jannicelli, head of community impact at JPMorgan Chase. “We do believe that advancements in technology, as has been the case throughout history, have the potential to advance our economy and positively impact communities.”

    Since Verity Credit Union began using Zest AI in lending decisions last year, it has seen a significant increase in the number of approvals for protected status applicants, including a 271% rise for individuals aged 62 and older, a 177% increase for African Americans and a 375% uptick for Asian Americans and Pacific Islanders. Approvals for women increased by 194% and by 158% for Hispanic borrowers. 

    The $809 million-asset credit union tries not to decline people without helping them get to a yes, Webb said.

    “Not everyone has been told how to navigate finances,” Webb said. “We also understand, especially for traditionally underserved individuals, there’s a lot of trauma around finances. So dealing with those issues that may be present for folks helps get them in the position of a yes for some of the loans.”

    The credit union is using Zest AI software to make unsecured auto loans, credit cards and personal loans. It meets quarterly with Zest’s data analytics team to review data on the results. 

    Tia Narron, chief lending officer at Verity Credit Union, considers a borrower’s current ability to repay the loan a much stronger indicator than if the person’s credit history indicates a brief past financial challenge.

    The credit union hopes to use this technology beyond lending, for things like preapprovals and account opening.


    “It is so traumatizing for some to even just walk into a bank to apply, because of their past experience,” said Tonita Webb, CEO of Verity Credit Union in Seattle. “I know people who won’t do it because they think the financial services industry is not for them because of all the nos that they have received.”

    AI’s unintended consequences

    As the many recent examples of inaccuracies, hallucinations and bias in generative AI models show, AI is obviously not a cure-all.

    “I believe that technology is an accelerant, not necessarily a problem solver,” Coaxum said. “It could make the problem worse if we’re not careful.”

    The use of AI to make decisions doesn’t equate to treating people equally, Coaxum said, because AI models are dependent on the datasets they are fed. And where banks aren’t serving minority communities, or aren’t serving them much, they lack the necessary data.

    According to the Federal Reserve Bank of Philadelphia, since the onset of the COVID-19 pandemic, the total number of U.S. bank branches has declined by 5.6%. The number of so-called banking deserts — neighborhoods where no banks have a physical presence — has increased by 217, and the population living in banking deserts has increased by more than 760,000 people. 

    A consequence of under-serving minority communities is that when banks are building datasets to inform the algorithms they use for lending decisions, they don’t have a large enough data sample to be able to really understand payment behaviors of these customer bases. 

    “It becomes, in my mind, challenging to have a robust lending framework,” Coaxum said. “Not because they’re not good people, not because they don’t want to, they just don’t have the customer base.”

    There is a chance AI could perpetuate discrimination, resulting in further unequal treatment of racial minorities, Goel said.

    “To mitigate the risk of worsening the racial wealth gap, we have to ensure that AI systems are ethically developed, regularly audited for biases, and are regulated to prioritize fairness and inclusivity in financial services,” he said.

    AI systems used in commercial settings are typically trained on past human-generated data, pointed out Daniel Susskind, economics professor at King’s College London, senior research associate at the Institute for Ethics in AI at Oxford University and author of the book “Growth: A History and a Reckoning.” 

    “So a system that determines who gets a job interview is in part trained on the sorts of decisions that human interviewers have made in the past,” Susskind said. “The great risk, and we see this in practice, is that the sorts of biases that people exhibit in human decision making simply get replicated and in some cases magnified by these systems, which are learning how to act from human experience.”

    When AI models do demonstrate biases after being trained on human data, “quite often they tell us interesting and uncomfortable things about ourselves,” Susskind said. “They hold a mirror up sometimes to our own biases, some of which we didn’t know that we had.”

    In a paper entitled “What’s in a Name? Auditing Large Language Models for Race and Gender Bias,” Stanford Law School graduate student Amit Haim, research fellow Alejandro Salinas de Leon and professor Julian Nyarko, who is also associate director of the Stanford Institute for Human-Centered Artificial Intelligence, asked ChatGPT and other large language models for advice in several scenarios, such as buying a car or a bicycle, varying only the name used. Names commonly associated with white men, such as Dustin, Hunter and Jake, produced the best results. Names associated with Black women, such as Keyana, Lakisha and Latonya, received the least advantageous outcomes. 

    “Models are trained on historically biased data,” said Salinas de Leon. “So when you put bias in, you will get bias out on the other side. If we continue on this path without properly reviewing the models and the training data they are given, then we’ll definitely increase the gap because we’re unaware of all the biases that they were trained on.”
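    The audit design the Stanford researchers describe can be sketched in a few lines: hold the scenario text fixed, vary only the name, and compare the model's answers across name groups. The version below is a stand-in that runs offline; the canned dollar figures are invented for illustration and are not the study's results, and a real audit would call an actual LLM API and parse the dollar amount out of its free-text reply.

```python
# Sketch of a paired-prompt name audit. The canned answers are invented
# so the example runs offline; they are NOT the Stanford study's data.
from statistics import mean

SCENARIO = "How much should {name} offer for a used bicycle listed at $200?"

NAMES = {
    "white_male_assoc": ["Dustin", "Hunter", "Jake"],
    "black_female_assoc": ["Keyana", "Lakisha", "Latonya"],
}

# Stand-in for an LLM call; a real audit would query the model and
# parse a dollar amount from its reply.
CANNED = {"Dustin": 180, "Hunter": 175, "Jake": 180,
          "Keyana": 165, "Lakisha": 160, "Latonya": 165}

def query_model(prompt):
    name = prompt.split(" offer")[0].split()[-1]
    return CANNED[name]

def audit(groups):
    """Average the model's advice per name group; a gap suggests bias."""
    return {group: mean(query_model(SCENARIO.format(name=n)) for n in names)
            for group, names in groups.items()}

results = audit(NAMES)
gap = results["white_male_assoc"] - results["black_female_assoc"]
```

    Because the prompts are identical except for the name, any systematic gap in the group averages is attributable to the name alone; the same harness extends to car-purchase, salary-negotiation and other scenarios.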

    On the other hand, algorithms have less intentional bias than humans, Nyarko pointed out. 

    “Algorithms don’t have animus,” he said. “In the law, we care a lot about, do you have discriminatory intent? When algorithms make decisions, they don’t have the intent to hurt minorities. They might do that as a byproduct, but for humans, there can be specific intent or subconscious biases.” 

    According to Laura Kornhauser, founder and CEO of Stratyfy, transparency is key for a fintech providing AI-based underwriting and fairness models. Many models are tested after they’ve made decisions, which can make it hard to revise the models, she said.

    “That ends up being really essential in this bias question,” Kornhauser said. “If I’m just feeding the data we have into a machine, even if I’m doing some smart things around dual optimization and adversarial debiasing, if I can’t see inside the guts of the machine and make changes to how it’s working, then the risk of that bias that exists in the data being propagated forward is very real and very meaningful.” 

    Stratyfy is working with Underwriting for Racial Justice on a pilot with several lenders to drive greater fairness and access within BIPOC communities. 

    “That ends up being such a hard piece of really moving that racial wealth gap as it relates to availability of fairly priced credit,” Kornhauser said. “So many lenders are so set in the way they’ve done things before.” 

    Part of a broader issue

    The racial wealth gap is part of an overall wealth gap in America. According to Advisorpedia, more than 70% of wealth in America is owned by 10% of families. The gap between the haves and the have-nots isn’t new, but it has been growing. 

    “When you look at 74% of Americans, according to our Inside the Wallet report, living paycheck to paycheck, you realize very quickly that it’s just everybody you know,” said Michael Woodhead, chief commercial officer of FinFit. 

    “Despite the best efforts of organizations like ours that are focused on financial wellness solutions and services, this problem’s only gotten worse, and it was exacerbated by macroeconomics that came out of the pandemic,” Woodhead said.

    In his view, the financial services industry in this country has always been set up to serve people who have extra money at the end of the month, taking that extra money and helping them turn it into more.

    “As a result, if you don’t have extra money at the end of the month, the financial services industry really doesn’t have much to offer you,” Woodhead said. 

    The way most Americans who are living paycheck to paycheck solve problems of lack of liquidity is with debt services that they can’t afford, which creates even more problems, Woodhead said.

    “But financially healthy people, even if they don’t have savings to speak of, have access to affordable credit,” Woodhead said. 

    FinFit works with employers to provide financial services to individuals who are underserved by the marketplace today, he said. It offers access to credit for emergencies or for long-term debt consolidation, with interest rates of 7.9% to 24.9%. Applicants don’t need to have a FICO or VantageScore score, and instead, FinFit relies on a machine learning algorithm to price its loans.

    The most important thing FinFit offers is an emergency savings solution, Woodhead said. “So the next time I have a financial emergency, I have an option: I could use credit, or I could use my own emergency savings account that I have built up over time,” Woodhead said. 

    The traditional financial services industry has been paternalistic in telling people they’re spending too much money — if they would just spend less than they make, they wouldn’t have these problems, Woodhead said. 

    “That’s the way we have tried as an industry to solve this problem for about 30 years: by shaking a finger at people,” Woodhead said.

    The cost of doing nothing

    Banks that don’t try to address the racial wealth gap face an existential threat, Tavares said.

    “The demographics of our society are changing and technology has to keep pace in order for the lending system to continue to be resilient, growing, fair and free from risk,” Tavares said. “What people don’t often think about is there’s a significant cost to not updating and innovating the technology for lending.”

    Some lenders hold that what worked 30 years ago or 20 years ago is tried and true and will continue to work today.

    “There’s actually a risk for that because in the America that we have today, the borrowers are not the same as 30 years ago,” Tavares said. “And yet you’re using this old, outdated technology, so there’s a risk also of not innovating.”

    Many banks are making the decision to include more updated and inclusive technology because it’s a business imperative in a country that’s rapidly becoming majority-minority demographically, he said.

    “If you look at a state like California, 58% of the population is Asian American, Hispanic American, and African American,” Tavares said. “If you can’t lend effectively to those people because you have outdated technology, that’s a business problem, that’s a profitability bottom line problem,” he said.  

    Penny Crosman

  • Racist emails becoming focal point in redlining settlements



    The Justice Department has secured $122 million from a dozen banks and mortgage companies in redlining cases since Attorney General Merrick Garland announced the agency’s Combat Redlining Initiative in 2021.

    Haiyun Jiang/Bloomberg

    When the Justice Department alleged last year that American Bank of Oklahoma had engaged in redlining, emails containing racial slurs became a focal point of the allegations. One bank executive forwarded an email that proclaimed “Proud to be White!” and used the “N word” in its entirety and other racial slurs.

    In a separate redlining case against Trident Mortgage, the Justice Department described how loan officers, assistants and other employees received and distributed emails containing racial slurs and content that used racial tropes and terms. The communications sent on work emails included a photo showing a senior loan officer posing with colleagues in front of a Confederate flag, pejorative content related to real estate and appraisals, and content targeting people living in majority-minority neighborhoods. Trident, which is owned by Warren Buffett’s Berkshire Hathaway, settled the DOJ’s complaint in 2022 for $24 million.

    Since the Justice Department launched its Combat Redlining Initiative in late 2021, racist emails have received more attention from both the DOJ and the Consumer Financial Protection Bureau in an effort to show racial bias has permeated a company’s culture.

    Discriminatory emails on their own have not been used to allege redlining. Rather, they are combined with key lending statistics that show how lenders compare with their peers in making loans in minority communities and whether a lender has avoided locating branches or hiring loan officers in minority communities. All of that, taken together, is then used to show intentional discrimination.

    In some cases the emails help regulators differentiate among lenders that are not providing equal access to credit.

    Banks rarely push back against redlining claims and typically choose to settle such cases, often citing the cost and distraction of protracted litigation. But some legal experts say that financial institutions have little control when a racist email is sent to an employee from outside the company. Regulators draw a distinction when discriminatory emails are sent by a company’s own employees, or forwarded to others, even without comment.

    “Holding a company accountable for an employee’s views or statements, even when those statements are inconsistent with the company’s values and culture, places a burden on that company to censor its employees to avoid the risk of being branded as a discriminatory lender,” said Andrea Mitchell, managing partner at Mitchell Sandler, who represented American Bank of Oklahoma.

    “There are limits on an employer’s ability to prevent staff from receiving racially insensitive emails or sharing personal views to exercise their right to freedom of expression,” she added.

    Still, legal experts are quick to point out that discrimination is against the law. Employees have no First Amendment rights to assert when using a company’s communication system.

    “If there are racist jokes or an employee saying they’re proud to be white, they’re not going to have much of a case on free speech grounds because no one is punishing the employee for saying it. They’re just using it as evidence to bolster claims of discriminatory intent,” said David E. Bernstein, a law professor at George Mason University School of Law.

    Lisa Rice, president and CEO of the National Fair Housing Alliance, recalled working at the Toledo Fair Housing Center nearly two decades ago and routinely sending requests for emails, text messages and audio and video recordings that included a list of specific racial slurs.

    “We’ve always been able to use public statements, verbal or written, as evidence in fair housing and fair lending cases,” said Rice. “You can request for emails to be turned over and those emails can be used as evidence and as evidence of discrimination. And they might even be used as evidence of discriminatory intent.”

    She added that regulators “may not have gone full throttle” in using emails in the past to bolster claims of intentional discrimination.

    To be clear, racist emails are found in a minority of redlining cases currently being brought by the DOJ.

    Though searching hundreds of thousands of emails or texts is a ponderous task, sophisticated tools, including those that utilize artificial intelligence, can make it much easier to root out racist terms. In some cases there may be just a handful of racist emails out of hundreds of thousands.

    “This is old-school redlining using new techniques,” said Ken Thomas, president of Community Development Fund Advisors and an expert on the Community Reinvestment Act, which requires that banks lend to low- and moderate-income communities. Among the LMI population, about 60% are minorities, he said.


    Thomas said regulators are searching for the digital-age equivalent of a smoking gun.

    “They are checking emails, Instagram and text messages, looking across the board at all communications, period,” said Thomas. “It’s more than a smoking gun. It’s a gun with fingerprints and blood stains on it.”

    Bernstein agreed, adding that the emails typically are used as supplementary evidence to get a bank or lender to agree to a settlement rather than have a case go to trial.

    “Some of the emails may actually signal a racially charged environment where you wouldn’t really trust the people not to be discriminatory and some may just be from a few adolescent-types sending silly or stupid jokes that they really shouldn’t be sending, but either way it’s not gonna look good to a jury or the public,” he said. “If it ever got to a jury, the government says, ‘Look, here are these five emails that show the racist environment people are working in.’ That’s a very effective tactic.”

    Since Attorney General Merrick Garland announced the Combat Redlining Initiative in 2021, the department has secured over $122 million from 12 banks and mortgage lenders to resolve redlining allegations. The Justice Department is working with its civil rights division and U.S. attorneys’ offices in coordination with the Office of the Comptroller of the Currency and the CFPB. Garland has said the DOJ has 25 redlining cases in its pipeline.

    Garland has spoken about how lenders are breaking the law by redlining and he has put a priority on cracking down on lenders to redress past wrongs. He also has highlighted how the gap in homeownership rates is wider today than in the 1960s. The homeownership rate for whites currently is 74% compared with 45% for Blacks, a 29-point gap, according to the U.S. Census Bureau. In 1960, the homeownership rate was 65% for whites and 38% for Blacks, a 27-point gap.

    The gap in homeownership is wider now than before the passage of the Fair Housing Act of 1968, which bans discrimination in home lending. That’s the law that the DOJ typically uses to bring discrimination cases against lenders. Additionally, the CFPB has jurisdiction over the Equal Credit Opportunity Act, which prohibits discrimination in any aspect of a credit transaction.

    “Redlining remains a persistent form of discrimination that harms minority communities,” Garland said at a news conference in 2021, when the DOJ first announced its redlining initiative.

    He also has stated that “redlining is a practice from a bygone era, runs contrary to the principles of equity and justice, and has no place in our economy today.”

    Rice said that the increase in redlining cases suggests that lenders need more training in compliance management and fair lending.

    “Every single year the federal regulatory agencies conduct fair lending training and HUD provides all kinds of training on best practices in fair housing to learn about what are the best practices and what you should and shouldn’t do,” she said.

    Still, some experts have voiced concerns that incendiary emails have become a centerpiece of some fair lending investigations.

    “Federal regulators have effectively investigated and pursued redlining claims for decades without the need for combing through emails and text messages that are entirely unrelated to lending and branching,” said Mitchell, the attorney for American Bank of Oklahoma.

    She also suggests banks push back against claims that are false and inflammatory or that harm a bank’s reputation.

    In the case of American Bank of Oklahoma, the Justice Department made a reference in a complaint filed with the courts to the 1921 Tulsa Race Massacre, in which white rioters killed as many as 300 people, according to some accounts. The massacre destroyed the city’s Black business district, known as the Greenwood District.

    The $313 million-asset bank in Collinsville, Oklahoma, vehemently objected to any link between the current redlining allegations against it and the massacre, given that the bank was founded in 1998, nearly 80 years after the massacre occurred. A magistrate judge sided with the bank and struck the two paragraphs of the complaint that mentioned the massacre. The rest of the complaint remained intact.

    There also is a concern that the use of racist emails has the effect of branding a company as racist even as settlement agreements require that lenders build relationships and extend credit in minority communities.

    In the case of American Bank of Oklahoma, its settlement requires it to lend $1 million in Black and Hispanic communities in Tulsa.

    “There’s obviously all sorts of unintended consequences,” said Bernstein, the law professor at George Mason University.

    “It’s an interesting paradox. We’re going to announce you’re racist and said now go lend to people who we just told shouldn’t trust you. They’re making it much harder for these companies to lend and get people to borrow from them, or to recruit members of minority groups on their staff,” he added.

    Kate Berry

  • San Jose police officer out after internal probe finds racist text messages



    A San Jose police officer is off the force after an internal investigation revealed a slew of racist text messages sent to another officer, including one that said, “I hate black people,” The Times has confirmed.

    Mark McNamara resigned Wednesday after being notified of the probe into his texts, Police Chief Anthony Mata told the Mercury News.

    In a statement, Mata said the “disgusting” text messages were discovered during an unrelated investigation, and a separate probe was immediately launched looking into the texts.

    “There is zero tolerance for even a single expression of racial bias at the San José Police Department,” Mata said. “If any employee’s racial bias rears its ugly head, rest assured that I will take immediate action to ensure they are not part of this organization.”

    Mata’s statement included 10 pages of texts sent by McNamara, many of which were presumably referencing a March 2022 incident in which McNamara shot K’aun Green, a Black college student, after Green helped break up a fight at a taqueria near San Jose State University.

    One text, dated a day after the shooting, said “N— wanted to carry a gun in the Wild West.” A following text said “Not on my watch haha.”

    Green sued the city, and McNamara sent texts attacking his legal team.

    “They should all be bowing to me and bringing me gifts since I saved a fellow n— by making his rich as f—. Otherwise he woulda lived a life of poverty and crime,” one text said.

    The texts were sent to a current employee of the police department, who responded with “concerning dialogue,” according to Mata. That employee was placed on administrative leave pending an internal investigation.

    “There is nothing more sickening than a person in power abusing their position,” said San Jose Mayor Matt Mahan in a statement. “I will sleep better tonight knowing that this individual is no longer carrying a badge and gun.”

    Jack Flemming