ReportWire

Tag: Big Data

  • Former BLS chief warns Powell is “flying blind” at a pivotal time for the Fed | Fortune

    The Federal Reserve faces an unprecedented challenge as it prepares to set interest rates next week—making its decision with almost no economic data available.

    The government shutdown has halted the release of most U.S. economic statistics, including the monthly jobs report. Compounding the problem, the Fed recently lost access to one of its main private sources of backup data.

    Payroll-processing giant ADP quietly stopped sharing its internal data with the central bank in late August, leaving Fed economists without a real-time measure that had covered about one-fifth of the nation’s private workforce. For years, the feed had served as a check on job-market conditions between the Bureau of Labor Statistics’ monthly reports. Its sudden disappearance, first reported by the Wall Street Journal, could leave the Fed “flying blind,” former Bureau of Labor Statistics commissioner Erica Groshen said.

    Groshen told Fortune that, in her decades working at the BLS and inside the Fed, the loss of ADP data is “very concerning for monetary policy.”

    The economist warned that at a moment when policymakers are already navigating a fragile economy—Fed Chair Jerome Powell has said multiple times that there is no current “risk-free path” to avoid recession or stagflation—the data blackout raises the risk of serious missteps. 

    “The Fed could overtighten or under-tighten,” Groshen said. “Those actions are often taken too little and too late, but with less information, they’d be even more likely to be taken too little too late.” 

    Rupture after years of collaboration

    Since at least 2018, ADP has provided anonymized payroll and earnings data to the Fed for free, allowing staff economists to construct a weekly measure of employment trends. The partnership is well-known to both Fed insiders and casual market watchers. However, according to The American Prospect, ADP suspended access shortly after Fed Governor Christopher Waller cited the data in an Aug. 28 speech about the cooling labor market.

    Powell has since asked ADP to restore the arrangement, according to The American Prospect.

    Representatives at ADP did not respond to Fortune’s request for comment. The Fed declined to comment.

    Groshen said there are several plausible reasons why ADP might have pulled the plug. One possibility, she said, is that the company found a methodological issue in its data and wanted to fix it before continuing to share information used in monetary policy. 

    “That would actually be a responsible decision,” she told Fortune, noting that private firms have more flexibility than federal agencies but less institutional obligation to be transparent about errors.

    Another explanation, Groshen said, could be internal or reputational pressure. After Waller mentioned the collaboration publicly, ADP may have worried about how it looked to clients or shareholders. 

    “You could imagine investors saying, ‘Why are we giving this away for free? The Fed has money,’” she said. The company might also have wanted to avoid being seen as influencing central-bank decisions, especially in a politically charged environment.

    Whatever the motivation, Groshen said the episode underscores how fragile public-private data relationships remain. Without clear frameworks or long-term agreements, companies can withdraw at any time.

    “If policymakers build systems around data that can vanish overnight,” she said, “that’s a real vulnerability for economic governance.”

    A data blackout at a critical moment

    The timing could hardly be worse. 

    On Thursday next week, the Federal Open Market Committee meets to decide whether to lower interest rates again, following a long-awaited quarter-point cut in September. With the BLS pausing most releases under its shutdown contingency plan, official figures on employment, joblessness, and wages have been delayed—starting with the September report and possibly extending into October.

    In the absence of real-time data, Fed economists are relying on a patchwork of alternatives: state unemployment filings, regional bank surveys, and anecdotal reports from business contacts. Groshen called those “useful but incomplete,” adding that the lack of consistent statistical baselines makes monetary policy far more error-prone.

    She advocated for the BLS to receive “multiyear funding” from Congress so that it could stay open even during government shutdowns. 

    “I hope that one silver lining to all these difficulties will be a realization on the part of all the stakeholders, including Congress and the public, that our statistical system is essential infrastructure that needs some loving care at the moment,” Groshen said.

    Eva Roytburg

    Source link

  • Louisiana Hands Meta a Tax Break and Power for Its Biggest Data Center

    The agreement sets out hiring timelines that the company must also hit to receive these tax incentives: Meta can receive the highest property tax exemption as long as it hires the equivalent of 300 “full-time” jobs by 2030, 450 by 2032, 475 by 2033 and 500 by December 31, 2034.

    Louisiana’s agreements ask for more than some other states’ tax subsidies. According to Good Jobs First, nearly half of state tax subsidies for data centers don’t require any new jobs to be created. But Miller doubts that the tax breaks were necessary at all to entice a company as large as Meta. “While everyone likes to avoid taxes, they’re not going to hire people in Richland [Parish] just because they’re going to get a tax break,” Miller says.

    Louisiana had already amended a tax rebate in 2024 to create an exemption for data centers and entice Meta; in its latest iteration, it says data centers can receive a full sales tax exemption for equipment purchases in the state as long as they add 50 full-time jobs and invest at least $200 million by July 1, 2029. A separate contract viewed by WIRED affirms that this applies to the Richland Parish data center, in addition to the PILOT agreement.

    According to Good Jobs First, at least 10 states have data center subsidies worth more than $100 million each and “have suffered estimated losses of $100 million each in tax revenue for data centers.” In total, these states forgo more than $3 billion in taxes annually for data centers. Texas revised the estimated cost of its data center subsidy in 2025 from $130 million to $1 billion. In 2024, Georgia’s legislature passed a pause on data center subsidies, but Governor Brian Kemp vetoed it.

    The Franklin Farms site in Holly Ridge, the area of Richland Parish where Meta’s data center is being built, was purchased by Louisiana specifically for economic development projects. In its ground lease with Meta, Louisiana offered the 1,400-acre plot to the company for $12 million, which the lease says was the cost to the state of acquiring and maintaining the land. The lease also says Meta’s $732,000-a-year “rent” is “credit toward the Base Purchase Price,” meaning the company will have paid off the property a little over 16 years into its 30-year lease.

    The price for the potential sale would be slightly higher if Meta does not reach minimum hiring and investment thresholds: As an example, the lease says if Meta only spends $4 billion in the state instead of $5 billion, the property would end up costing it $19 million. Louisiana Economic Development reserves the right to reclaim the property if Meta doesn’t invest at least $3.75 billion and hire the equivalent of 225 “full-time” jobs by 2028. When asked if Meta plans to purchase the property, Clayton said, “We’ll keep you updated on our future plans for this site.”

    Meta’s presence has already caused land values to jump. A nearby tract of 4,000 acres of land in Holly Ridge is for sale for $160 million, or $40,000 per acre—more than 4.5 times the price paid by Louisiana for the data center’s site.
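    Both of those figures check out with quick arithmetic. The sketch below (plain Python, using only the numbers reported in the lease and the nearby listing) reproduces the payoff timeline and the per-acre comparison:

    ```python
    # Sanity check of the lease figures reported above (all inputs from the article).
    base_purchase_price = 12_000_000  # Louisiana's stated cost for the 1,400-acre plot
    annual_rent = 732_000             # Meta's yearly rent, credited toward the purchase price

    years_to_pay_off = base_purchase_price / annual_rent
    print(f"Rent covers the purchase price after ~{years_to_pay_off:.1f} years")  # ~16.4

    # Per-acre comparison with the nearby 4,000-acre listing.
    state_price_per_acre = base_purchase_price / 1_400  # ~$8,571 per acre
    listing_price_per_acre = 160_000_000 / 4_000        # $40,000 per acre
    print(f"Listing is ~{listing_price_per_acre / state_price_per_acre:.1f}x the state's price")  # ~4.7x
    ```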

    But there’s also a concern that Meta could delay or abandon the data center project. The PILOT agreement its subsidiary signed with the state says the company’s timeline will depend on “numerous factors outside of the control of the lessee, such as market orientation and demand, competition, availability of qualified laborers to construct and/or weather conditions.”

    “My general fear is that too many data centers are being built,” Miller says. “That means some of the data centers are just going to be abandoned by the owners.”

    She says that if Big Tech cuts back its investments in data centers, Meta would not even be able to find another buyer for the site. “Essentially, the state will be stuck with this warehouse full of computers,” Miller says.

    Roshan Abraham

    Source link

  • Why It’s Time to Rethink the Health Data Economy | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    One of the most valuable commodities of our time now flows through our algorithms, powers the devices tracking our movements and fuels the newest health innovations — our data.

    The list of items tracked is staggering, ranging from every step we take to our heartbeats and everything in between.

    These moments are fueling a booming healthtech economy built on a skewed exchange: People generate data, and companies extract the value.

    Related: Why Proactivity With Data Security and Privacy Is More Important Than Ever — and How to Be on Top of It

    Under pressure

    However, that architecture is cracking. Almost 193 million people are estimated to have been affected by the largest healthcare breach on record, the 2024 Change Healthcare attack. In 2024 alone, at least 14 data breaches each compromised more than one million patient records, with almost 238 million people exposed across these incidents. If our most valuable asset can leak at that scale, it is fair to say data extraction isn’t just a moral grey area; it’s operationally unsound as well.

    In recent times, consumers have begun telling founders what they want instead: control. In April 2025, Pew reported that 55% of U.S. adults wanted more personal control over how AI is used in their own lives. This signals a demand for agency in the new systems shaping our healthcare choices.

    There is one fundamental thing entrepreneurs and founders should understand when building healthcare platforms today — treating your contributors as stakeholders rather than subjects will go a long way.

    This means building products and policies where value flows inwards, not just outwards. The form can be as direct as paying for contributions, or as strategic as granting early access to features, premium analytics and dashboards, or credits that unlock opportunities in research and care.

    The bottom line is alignment. Richer, more consistent streams of high-quality data are generated when people feel they have an element of ownership. This richer data makes better algorithms, and better algorithms deliver products that justify the relationship.

    Transparency is the friend of alignment

    Make the data flows legible in the product: Tell people what you collect and why, where it goes and how long it stays there. Replace vague consent boxes with optional permissions that let a person authorize one use of their data and decline another, and show, in the product, how those toggles change access. When people can see and steer the flow, privacy stops being a legal document and becomes an experience.
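    To make that concrete, here is a minimal sketch of per-use consent in Python (the permission names are invented for illustration, not drawn from any real product): each data use gets its own toggle, and every access check consults the toggle rather than a blanket consent flag.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical per-use permissions; the names are illustrative only.
    @dataclass
    class ConsentSettings:
        toggles: dict = field(default_factory=lambda: {
            "personalization": True,    # allow data to tailor the user's own experience
            "research_sharing": False,  # decline sharing with research partners
            "marketing": False,         # decline marketing use
        })

        def allows(self, use: str) -> bool:
            # Default-deny: an unknown or unlisted use is never authorized implicitly.
            return self.toggles.get(use, False)

    settings = ConsentSettings()
    for use in ("personalization", "research_sharing"):
        print(use, "->", "authorized" if settings.allows(use) else "declined")
    ```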

    Private companies are not the only ones that can benefit from implementing such systems; public-sector research is leaning into the same logic. The NIH’s All of Us program is designed to return value to participants while opening access for researchers. It has more than 866,000 participants, creating one of the most diverse health datasets in the world. It is clear that when participation is treated as a partnership, rather than a data grab, both the company and the individual benefit.

    Related: What Brands and Consumers Can Do to Build a Privacy-First Digital Future

    Ownership models

    Switzerland is a great example of why ownership models matter. The country’s MIDATA initiative enables individuals to maintain their own health records, contribute to research on their own terms and govern the platform as members.

    Many companies built on blockchain technologies talk about delegating ownership of data; traditional institutions can take a leaf out of that book. You don’t have to tokenize anything to learn from that structure.

    The shift begins with the story you tell. Instead of asking users for data so you can build, reframe it: ask them to build with you, and let them share in the value their data creates. Map your data flows and surface them in the product itself.

    By designing an incentive mechanism that is simple to understand and sustainable to manage, one that puts people at the center of the process, you will reap the rewards later and ensure you have the backing of your users as well.

    Christopher Crecelius

    Source link

  • 5 Data-Driven Trends Shaping the Future of Ecommerce | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    Data and analytics have become the driving force behind successful competition across industries. In this article, we’ll focus specifically on the role of data in the future of ecommerce.

    What follows is a discussion of key ways in which data relates to and supports the major emerging trends shaping ecommerce today and tomorrow.

    Related: How Ecommerce Businesses Are Leveraging Web Data to Understand Their Customers and Stay Ahead of the Competition

    Trend 1: Personalization and context

    Personalization has been a major trend in ecommerce for years. However, with the improvement of data technology, the speed and quality of personalized offers are reaching new levels. More advanced personalization engines push the envelope by also incorporating data points like seasonal trends, weather patterns and local events. For instance, a customer may get a recipe suggestion based on data predicting a rainy day ahead.
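    To picture how such an engine folds context into a recommendation, consider this hypothetical sketch in Python: a base affinity score from purchase history gets boosted by contextual signals like a rain forecast. The item names, tags, and weights are all invented; production engines learn these weights from data.

    ```python
    # Hypothetical context-aware scoring: base affinity from purchase history,
    # plus boosts from contextual signals (weather, season, local events).
    def score(item, affinity, context):
        s = affinity.get(item["id"], 0.0)
        if context.get("rain_forecast") and "comfort-food" in item["tags"]:
            s += 0.3  # illustrative weight; real engines learn these from data
        if context.get("season") in item["tags"]:
            s += 0.2
        return s

    items = [
        {"id": "soup-recipe", "tags": ["comfort-food", "winter"]},
        {"id": "bbq-grill",   "tags": ["summer", "outdoor"]},
    ]
    affinity = {"soup-recipe": 0.4, "bbq-grill": 0.5}
    context = {"rain_forecast": True, "season": "winter"}

    ranked = sorted(items, key=lambda i: score(i, affinity, context), reverse=True)
    print([i["id"] for i in ranked])  # the rainy-day context lifts the recipe to the top
    ```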

    To expand their reach beyond their own platforms, savvy retailers have been working diligently to acquire more contextual data. Tracking social media sentiment, monitoring how competitors are pricing their products, staying abreast of broad market trends — you name it. These alternative data sources help them construct a far richer understanding of their customer base. And when that understanding proves reasonably accurate, it can refine everything from inventory management to pricing strategies.

    Trend 2: AI and the smarts behind the interface

    Ecommerce and the magic of AI have been walking hand in hand for some time now. And it’s not just about deploying credible and flexible chatbots to shoulder some of the more formulaic customer support. Today, AI is used even in such vital initiatives as reinforcing entire supply chains. Still, the effectiveness of these applications is completely reliant on the quality and quantity of data that feeds into them.

    To function well, conversational commerce platforms require a substantial amount of customer interaction data to train their NLP models. In addition to “understanding” customers’ words, they must be able to grasp the actual intentions behind those words. For instance, to distinguish a casual browser from a serious buyer, these models need to constantly graze on successful sales dialogues, customer service chats and even samples of failed transactions to get a grip on what tends to trigger breakdowns in communication.
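    As a toy illustration of the kind of intent model described here (not any vendor's actual pipeline), a handful of labeled chat snippets is enough to train a bag-of-words classifier that separates browsing from buying intent; production systems do the same thing with vastly more data and far richer models.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Tiny invented training set: 1 = serious buyer, 0 = casual browser.
    texts = [
        "do you ship to Canada and how fast",  # buyer
        "can I pay in installments",           # buyer
        "is this in stock in size medium",     # buyer
        "just looking around thanks",          # browser
        "what's trending this season",         # browser
        "cool store, maybe later",             # browser
    ]
    labels = [1, 1, 1, 0, 0, 0]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    print(model.predict(["do you have this in stock and when can it ship"]))  # likely [1]
    ```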

    Meanwhile, AI-based predictive analytics help avoid overstocking while keeping stock-outs at a minimum. By drawing on historical transaction data, inventory levels, outside market signals and economic trends, these systems can be harnessed to anticipate demand with unprecedented accuracy.
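    A minimal version of that idea, with synthetic numbers rather than real transaction data, is a one-lag autoregression: regress each period's sales on the previous period's and extrapolate. Real systems layer in inventory levels, market signals, and seasonality.

    ```python
    import numpy as np

    # Synthetic weekly unit sales with a gentle upward trend.
    sales = np.array([100, 104, 103, 110, 112, 118, 117, 124], dtype=float)

    # Regress each week's sales on the previous week's (a one-lag autoregression).
    x, y = sales[:-1], sales[1:]
    slope, intercept = np.polyfit(x, y, 1)

    next_week = slope * sales[-1] + intercept
    print(f"Forecast for next week: ~{next_week:.0f} units")
    ```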

    For retailers that want to benefit from comprehensive AI systems, the data requirements are substantial. Such systems require clean, structured data from multiple sources, including customer relationship management systems, inventory databases, financial records and third-party market intelligence.

    Related: How Your Online Business Can Use AI to Improve Sales

    Trend 3: Rising data security concerns

    While ecommerce platforms manage increasingly granular customer data, cybercriminals are devising schemes to target these high-value assets for themselves. Recent breaches affecting major retailers have highlighted the critical importance of data security, not just as a technical concern, but as a fundamental business requirement.

    The GDPR, the CCPA and other legal requirements don’t let companies off the hook until they’re able to prove compliance with mandatory practices like maintaining detailed records of what data they collect, how they use it and who they share it with. Along with staying on the right side of the law, platforms that effectively ensure compliance gain an extra asset of customer trust by signalling their commitment to transparency.

    Thus, security-minded companies are embracing zero-trust security frameworks, encryption for data in transit and at rest, and similar advanced measures to protect customer information.

    Trend 4: Sustainability goals

    Research shows that over 70% of consumers are willing to pay premium prices for environmentally responsible products. The time when marketing buzzwords and “greenwashing” still work is passing. Savvy consumers, who are increasingly skeptical of non-committal statements about sustainability, are driving demand for unprecedented levels of transparency in supply chains and manufacturing processes.

    To make carbon tracking across entire supply chains viable, companies must, at a minimum, gather data from suppliers, shipping companies and even customers’ delivery preferences. The most progressive retailers use this data to offer things like:

    • Carbon-neutral shipping options

    • Low-emission delivery routes

    • Environmental impact scores for individual products

    The data requirements extend beyond environmental metrics, though. If sustainability is really put front and center, the entire product lifecycle — from raw material sourcing to packaging materials and end-of-life disposal — must be tracked as well. Another significant advantage for retailers is that the same data systems used for tracking environmental impact can also be leveraged to identify cost savings and supplier risks, and even to seed circular economy initiatives.

    Related: How to Make Your Ecommerce Business Truly Sustainable (and Why It’s Important)

    Trend 5: Mobile commerce — a crucial data frontier

    Mobile commerce now makes up the bulk of transactions online, and the potential for data analysis to improve its results is vast. Factors like touch patterns, location data, app usage habits and responses to push notifications are ready to be tapped into by enterprising retailers. Location data, for example, enables ecommerce platforms to do things like adjust inventory displays based on regional preferences, optimize delivery options for specific neighbourhoods or coordinate online promotions with events scheduled at nearby brick-and-mortar stores.

    Mobile platforms also generate real-time behavioral data that allows for immediate responses. A good example of this is utilizing mobile analytics (with data streaming in from multiple touchpoints) to identify customers struggling with the checkout process and offering help, rather than waiting for a formal complaint to be made.
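    In practice, "identifying customers struggling with checkout" often starts as a simple rule over the event stream. The sketch below is hypothetical (the event names are invented), but it shows the shape of the logic: repeated failure events within a session trigger a proactive offer of help.

    ```python
    # Hypothetical event-stream rule: flag sessions with repeated checkout failures
    # so the app can offer help instead of waiting for a formal complaint.
    FAILURE_EVENTS = {"payment_declined", "form_validation_error", "checkout_abandoned"}

    def needs_help(session_events, threshold=2):
        failures = sum(1 for event in session_events if event in FAILURE_EVENTS)
        return failures >= threshold

    session = ["view_cart", "payment_declined", "payment_declined", "view_cart"]
    if needs_help(session):
        print("Trigger in-app assistance: live chat offer or alternate payment method")
    ```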

    The trends reshaping ecommerce all share one thing in common: They’re only as effective as the data strategies that undergird them. And companies that recognize this connection and invest accordingly won’t just participate in the future of ecommerce — they’ll define it.

    The upshot of this is that in the coming decade, the ecommerce leaders won’t necessarily be those with the biggest marketing spend or the flashiest products. More likely, they’ll be the ones that strategically utilize their resources to bulk up their data capacity.

    Julius Černiauskas

    Source link

  • Microsoft and Meta profits are soaring but their stocks are sagging because both companies aren’t building data centers fast enough

    NEW YORK (AP) — Wall Street is feeling the downside of high expectations on Thursday, as Microsoft and Meta Platforms drag U.S. stock indexes lower despite delivering strong profits for the summer.

    The S&P 500 was down 1.6% in midday trading and on track for its worst day in nearly eight weeks, falling further from its record set earlier this month. The Dow Jones Industrial Average was down 418 points, or 1%, as of 11:15 a.m. Eastern time. The Nasdaq composite was 2.4% lower and heading for a second straight loss after setting its latest all-time high.

    Microsoft reported bigger profit growth for the latest quarter than analysts expected. Its revenue also topped forecasts, but its stock nevertheless sank 6% as investors and analysts scrutinized the results for possible disappointments. Many centered on Microsoft’s estimate for upcoming growth in its Azure cloud-computing business, which fell short of some analysts’ expectations.

    The parent company of Facebook, meanwhile, likewise served up a better-than-expected profit report. As with Microsoft, though, that wasn’t enough for the stock to rise. Investors focused on Meta Platforms’ warning that it expects a “significant acceleration” in spending next year as it continues to pour money into developing artificial intelligence. It fell 3.6%.

    Both Microsoft and Meta Platforms have soared in recent years amid a frenzy around AI, and they’re entrenched among Wall Street’s most influential stocks. But such stellar performances have critics saying their stock prices have simply climbed too fast, leaving them too expensive. It’s difficult to meet everyone’s expectations when they’re so high, and Microsoft and Meta were both among Thursday’s heaviest weights on the S&P 500.

    The next two companies in the highly influential group of stocks known as the “Magnificent Seven” to deliver their latest results will be Apple and Amazon. They’re set to report after trading ends for the day, and both fell at least 1.3% on Thursday.

    Earlier this month, Tesla and Alphabet kicked off the Magnificent Seven’s reports with results that investors found impressive enough to reward with higher stock prices. The lone remaining member, Nvidia, will report its results later this earnings season, and its 4.3% drop was Thursday’s heaviest weight on the market after Microsoft.

    The tumble for Big Tech on the last day of October is helping to wipe out the S&P 500’s gain for the month. The index is down 0.7% and on track for its first down month in the last six, even though it set an all-time high during the middle of it.

    Still, it wasn’t a complete washout on Wall Street thanks in part to cruise ships and cigarettes.

    Norwegian Cruise Line Holdings steamed 8.2% higher after delivering stronger profit for the latest quarter than analysts expected. The cruise ship operator said it was seeing strong demand from customers across its brands and itineraries, and it raised its profit forecast for the full year of 2024.

    Altria Group rose 7.6% for another one of the S&P 500’s bigger gains after it also beat analysts’ profit expectations. Chief Executive Billy Gifford credited resilience for its Marlboro brand, among other things, and announced a cost-cutting program.

    Oil-and-gas companies also generally rose after the price of a barrel of U.S. crude gained 1.3% to recoup some of its losses for the week and for the year so far. ConocoPhillips jumped 4.9%, and Exxon Mobil gained 1%.

    In the bond market, Treasury yields continued their climb following a mixed set of reports on the U.S. economy.

    One report said a measure of inflation that the Federal Reserve likes to use slowed to 2.1% in September from 2.3%. That’s almost all the way back to the Fed’s 2% target, though underlying inflation, which strips out food and energy costs, was a touch hotter than economists expected.

    A separate report said growth in workers’ wages and benefits slowed during the summer. That could put less pressure on upcoming inflation. A third report, meanwhile, said fewer U.S. workers applied for unemployment benefits last week. That’s an indication that the number of layoffs remains relatively low across the country.

    Treasury yields swiveled up and down several times following the reports before climbing. The yield on the 10-year Treasury rose to 4.31% from 4.30% late Wednesday. That’s up sharply from the roughly 3.60% level it was at in the middle of last month.

    Yields have been rallying following a string of stronger-than-expected reports on the U.S. economy. Such data bolster hopes that the economy can avoid a recession, particularly now that the Fed is cutting interest rates to support the job market instead of keeping them high to quash high inflation. But the surprising resilience is also forcing traders to downgrade their expectations for how deeply the Fed will ultimately cut rates.

    In stock markets abroad, indexes sank across much of Europe and Asia.

    South Korea’s Kospi dropped 1.5% for one of the larger losses after North Korea test launched a new intercontinental ballistic missile designed to be able to hit the U.S. mainland in a move that was likely meant to grab America’s attention ahead of Election Day.

    Stan Choe, The Associated Press

    Source link

  • Valkyrie Announces New AI Company, Andromeda, Raises $4.5M in Austin’s Largest Pre-Seed Round

    A proprietary decision intelligence platform that delivers unbiased plans for national security and high-accountability enterprises

    Valkyrie, a leading provider of innovative applied sciences solutions, today announced the launch of its latest venture, Andromeda — a pioneering decision intelligence platform that is poised to revolutionize high-stakes decision-making.

    Andromeda, a subsidiary of Valkyrie, has successfully raised a $4.5 million pre-seed funding round, making it the largest pre-seed round raised in Austin, Texas, in 2024. This investment will fuel Andromeda’s mission to empower warfighters and other mission-critical decision-makers with unbiased, fact-based insights to drive success in their operations.

    “Valkyrie has continually, yet quietly and tirelessly, developed critical capabilities for industry and defense. Andromeda is a testament to our innovation-driven approach, and our commitment to deliver real value,” said Charlie Burgoyne, CEO of Valkyrie. “The science behind Andromeda has already supported our nation’s security for years, and with its expansion, we’ll soon be tackling the largest challenges across industry as well. Andromeda is shedding light on complex data for those who serve in the shadows.”

    The power of transformers (LLMs) and knowledge graphs will be seamlessly integrated through a proprietary approach to enable the most complex workflows. Andromeda’s unique architecture eliminates the risk of AI hallucinations and ensures the delivery of reliable, actionable insights. This hallmark feature sets Andromeda apart in the market, positioning it as a trusted partner for warfighters and other high-stakes decision-makers.

    Echoing the platform’s value, a Battalion Commander in the United States Special Operations Command shared their firsthand experience: “I’ve seen many mission planning tools, but Andromeda is a game-changer. This platform is integral to deliberate influence planning through the creation of psychological operations’ courses of action (COAs), empowering us to make faster, more informed decisions and increase mission success.”

    “In today’s complex and rapidly evolving operational environments, accurate and trustworthy information is paramount,” said Liz Coufal, co-founder of Valkyrie. “Our platform empowers users to make informed decisions with confidence, enabling them to achieve mission success and safeguard critical interests.”

    The pre-seed funding round was led by a consortium of prominent investors, including Trust Ventures and others. The investment will enable Andromeda to accelerate product development, expand its engineering team, and drive market adoption among key target segments, including the defense and intelligence communities.

    “Decision-makers in the defense community are faced with complex, high-stakes decisions on a daily basis. Filtering through piles of information sources can cost critical time, resources, and even lives,” said Salen Churi, General Partner at Trust Ventures. “We are excited to support Andromeda’s work to empower these decision-makers with fast-moving, fact-based insights to ensure successful high-stakes operations.”

    Andromeda will be available for deployment in private or government cloud, on-premise, and offline environments, ensuring universal access and seamless integration with existing workflows. The platform’s intuitive interface allows users to easily create prompts, leverage internal and external data sources, and customize output formats, including reports and dashboards.

    For more information about Andromeda and its decision intelligence capabilities, please visit www.andromeda.us.

    About Andromeda

    Established in 2024, Andromeda, a Valkyrie company, offers a decision intelligence platform that instantly delivers unbiased strategic plans for critical scenarios, also known as Courses-of-Actions (COAs), from a prompt. By integrating custom language models and a proprietary knowledge graph, Andromeda provides hallucination-free insights, actionable plans, and interactive knowledge graph visualizations. With features like source attribution and a thesis finder, Andromeda ensures reliability and eliminates bias. Trusted by the national security community and high-accountability enterprises, Andromeda empowers high-stakes decision-makers with unbiased, fact-based intelligence for mission success. Visit www.andromeda.us for more details.

    About Valkyrie

    Valkyrie is an applied science firm that builds industry-defining AI and ML models through our services, product, impact, and investment arms. We specialize in taking the latest advancements in AI research and applying them to solve our clients’ challenges through model development or product deployment. Our work can be found across industries, from SiriusXM and Activision to Chubb Insurance and the Department of Defense, to name a few. For more information, visit www.valkyrie.ai.

    Source: Valkyrie Andromeda LLC

    Source link

  • The AI data center revolution is happening right in your backyard

    If you use generative AI, you know that it can seem like magic. Chatbots and multimedia models can effortlessly conjure up poems or high-res videos at the snap of a finger. 

    But AI models’ speedy outputs and sleek interfaces mask the enormous amount of physical infrastructure behind them—and as AI continues to grow, the data centers and power plants that AI is built on are starting to get widespread attention outside of the industry.

    Earlier this week, I took a train to Orangeburg, New York, a sleepy lower Hudson Valley suburb just 25 miles from Fortune’s newsroom in downtown Manhattan. I was there to visit one of a wave of AI infrastructure projects popping up across the country—Orangeburg is the future home of data center company DataBank’s newest site, named LGA3. 

    DataBank already operates two data centers in the New York metro area: one in Newark, New Jersey, and one in Chelsea, Manhattan. But LGA3 will be by far its biggest site—a $250 million, 200,000-square-foot facility drawing up to 45 megawatts of energy to power five massive data halls packed to the gills with computer chips.

    The facility won’t open until next year, but tenants have already been booking space—most notably New Jersey-based AI startup CoreWeave, which recently secured an eye-watering $19 billion valuation and has already reserved almost half of LGA3’s capacity.

    “The explosive growth in artificial intelligence has required a complete reevaluation of traditional data centers to meet demand for next-generation compute requirements, and this new data center campus provides some of the most advanced new technologies that will allow us to deliver for our customers,” CoreWeave co-founder Brian Venturo wrote to me in a note.

    My taxi from the train station took no fewer than four wrong turns as we wound our way past farmhouses and office parks to the LGA3 construction site, wedged between an electrical substation and the New Jersey state line. Hopping out of the car, my first impression was the sheer size of the building. LGA3 looked about the size of a New York City block, a massive, single-story hall with high ceilings—I wouldn’t be surprised if you could fit a commercial jet inside.

    ‘Addicted to technology’

    DataBank CEO Raul Martynek greeted me on the way in, wearing clear-rimmed glasses and a lavender button-down. Martynek has been in the internet infrastructure industry for decades, almost since the advent of the commercial internet in the 1990s. He’s been with DataBank since 2017, overseeing the company’s 69 data centers across the U.S. and U.K. Martynek told me that he hasn’t seen an explosion in demand for digital infrastructure like the one AI is creating since the dot-com bubble of the late ’90s.

    “Humans are addicted to technology, period. And ultimately, for the data center sector, what we do is we enable humans to deploy more technology,” Martynek told me. “If you deploy more technology, you need more fiber, more cell towers, more data centers. And for this particular phase that we’re in with AI, data centers are the bottleneck.”

    Companies are shelling out billions to build out new data centers for cloud computing—such as this Amazon facility in Ashburn, Virginia.

    A huge increase in demand from AI has catapulted data centers into front-page headlines. Martynek explained to me that most things we do online nowadays—from accessing images on our phones to scrolling social media to prompting ChatGPT—involve physical hardware more than we realize. Wi-Fi routers and cell towers are constantly sending signals through underground fiber optic cables to data centers and remote servers, accessing stored information and keeping the internet humming.

    “The internet is a network, right? Information gets sent out over fiber optic cables as photons. And they travel around the world at close to the speed of light,” Martynek said. “I was hanging out with a network guy last night saying, ‘What do you do?’ He said, ‘We’re plumbers, right?’”

    And these days, being a plumber is a good business. Exponential increases in the amount of data being generated for and by the internet over the past 20 years—and expectations that AI will only speed things up even more—mean that space to store all that information is in high demand.

    “This device didn’t exist before 2007,” Martynek told me, pointing to his iPhone. “So think about how much content and how many applications have been created [by it]. All that stuff ends up in a data center…That’s the physical ecosystem.”

    AI boosts need for building space

    Data centers might not be the sexiest projects, but a surge in demand from AI companies is bringing in big money, heavy press coverage, and some of the biggest names in construction. DataBank alone has spent around $4.5 billion on data center projects since 2016. Tishman Speyer, the real estate company building LGA3, is one of the highest-profile names in the business: it worked on the World Trade Center and Chicago’s John Hancock Center, and it owns Rockefeller Center, too. A low-slung data center next door to a suburban Little League baseball complex might seem an odd addition to its portfolio, but it’s betting that data centers will prove to be just as important as skyscrapers.

    When I first got the invite to visit, the location surprised me. New York? Home to some of the highest real estate and energy prices in the country? Wouldn’t it make more sense to build in the middle of the desert, where land is cheaper and there’s access to bottom-dollar renewable energy?

    But Martynek explained that for many customers, it’s just not practical to be located thousands of miles away from one of the most important parts of your business. New York is one of the country’s largest data center markets, with around 800 megawatts of capacity currently online, much of it catering to finance and tech companies who depend on nearby computing capacity to build and trade around the clock.

    “It’s not practical for a data center to be in the middle of nowhere—there’s too much latency,” Martynek said, referring to delays in the response time between computers and offsite data centers. “Too many things can happen along the way.”

    “Data centers have tended to cluster around metropolitan areas,” he continued. “New York has always been a pretty big data center market. That’s really a function of the population and a function of the businesses—if you’re JPMorgan, you don’t want your data center in Omaha.”
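    Martynek’s latency point is easy to put numbers on. Light in fiber travels at roughly two-thirds of its vacuum speed, so round-trip time scales directly with distance; the sketch below uses approximate straight-line distances, and real fiber routes run longer.

    ```python
    # Back-of-the-envelope fiber latency: signals in glass travel at ~2/3 the speed
    # of light in a vacuum. Real routes are longer than straight lines.
    SPEED_IN_FIBER_KM_S = 200_000  # roughly 2/3 of 300,000 km/s

    def round_trip_ms(distance_km):
        return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

    print(f"Manhattan <-> Orangeburg (~40 km): {round_trip_ms(40):.2f} ms round trip")
    print(f"New York <-> Omaha (~2,000 km):    {round_trip_ms(2000):.1f} ms round trip")
    ```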

    Public policy is also a factor. New York Governor Kathy Hochul unveiled the state’s Empire AI initiative earlier this year, which earmarked over $400 million to fund, among other things, infrastructure such as data center projects.

    Donning a hard hat and reflective vest, I walked around the half-built structure with two construction managers. They pointed out the airplane hangar-sized area where the banks of computer chips would eventually be installed, along with the ventilation and water cooling to keep them from overheating. 

    Once construction is finished up, CoreWeave and DataBank’s other customers will start installing their chips, and DataBank expects the facility to be up and running in full by early next year. Once it’s online, CoreWeave will start leasing out its computing capacity to tech startups and other AI companies. Martynek told me DataBank hasn’t had any trouble finding customers.

    “We signed the contract with CoreWeave last year. This building didn’t even exist then—it was just dirt. That’s how in-demand this product is,” Martynek said. “There’s a frenzy.”

    As the saying goes, strike while the iron’s hot—DataBank is already putting together plans for another site right next door, LGA4. Next time you’re driving around town, keep an eye on the nondescript buildings in your area: The AI data center boom might be closer than you think.

    Dylan Sloan

    Source link

  • Why Vertical Integration Allows Leaders to Actually Control Their Data | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    The parable of “The Blind Men and The Elephant” tells the story of six blind men who come across an elephant for the first time. They each examine a different part of the elephant. The trunk. The ears. The tusks… You get the gist.

    Consequently, each person comes to a very different conclusion about what an elephant is.

    They are all partly right — but also entirely wrong.

    The moral of this story is simple. Different perspectives and incomplete information can lead to varying — and often inaccurate — interpretations. It’s an old story. But it’s a fable we should pay close attention to in our modern world. Especially when working with data.

    Related: Using Data Analytics Will Transform Your Business. Here’s How.

    Less will (almost) always be more

    Business leaders worldwide want to use their data to make better decisions and get more accurate insights into their business.

    But often, businesses will have multiple layers in their tech stacks. Some are new. Some are old. Some are integrated. Some are totally siloed. Each of these layers captures different data. And you get inaccurate insights when these pieces aren’t talking to each other.

    Businesses have access to more data than ever before. But quantity doesn’t equal quality.

    For many organizations, the actual quality of their data is being diluted. You create misaligned incentives by having so many different elements in your tech stack. One aspect of your stack may tell you one thing, but then the next part can contradict it. We see this a lot in advertising technology. A myriad of different buying platforms, data partners, publishers, analytics tools, CRM, segmentation tools and more. Often, it becomes so messy over time that it’s hard to get the actionable data you need to create insights, actions, and business impact.

    Access to lots of data sets does not equal good data. Looking at data sets in silos is an easy way to paint the wrong picture, which increases the likelihood of poor decision-making.

    Data is exciting, but you need to be able to view it in a single place, which is why I think the principles of vertical integration should be implemented here.

    Vertical thinking

    Vertical integration is a strategy that allows a company to streamline its operations by taking direct ownership of processes. In other words, it allows you to control your own destiny. In theory, it gives businesses greater efficiencies, reduced costs, and more control of the manufacturing and distribution process. Tesla is a famous example of this model in practice.

    Tesla implemented vertical integration across its business structure — but with a major focus on two key aspects: battery production and energy storage. Tesla knew that batteries were critical to EVs – and that success hinged on owning a highly contested battery supply chain. This focus allowed Tesla to leverage its expertise in battery technology and apply it to the energy storage market, creating synergies and shared resources across different product lines.

    It does everything from designing the cars and building the tech to making its own chips and selling the finished vehicles. Everything is in-house, meaning supply chain issues or manufacturing partners don’t slow it down. This integration has helped the company scale its operations, drive technological advancements, and position itself as a leader in sustainable transportation and energy solutions.

    Owning your data can work in the same way. But without the need to build a multi-billion dollar gigafactory in Nevada.

    By taking control of your tech stack and ownership of your information, you gain a more holistic view of what is happening with your business. And you also insulate yourself from issues in the outside world.

    This is invaluable on its own. But this singular view becomes even more powerful when you factor in the exponential growth of AI and ML technologies. Applying these tools to a vertically integrated data set can transform a business and unlock previously unknowable insights.

    Essentially, if your data is disparate and not well integrated, you can never get to the “Unknown Unknowns”—the things you didn’t even know to ask about. There are patterns invisible to human eyes. You can only see them with good data.

    More insights. More efficiency. More control. Vertical integration offers businesses control of their supply chains. But there are challenges, too. Even if you aren’t building huge factories, you need investment and support to make the necessary changes. And most importantly, you need to ensure that becoming self-reliant doesn’t mean you get tunnel vision and lose sight of developments outside your business.

    Related: 8 Ways Data Analytics Can Revolutionize Your Business

    Data and elephants

    If you look at different data sets in isolation, you will likely emerge with an incorrect idea of what an elephant — aka your business — really is.

    You might be partly right in places, but you will also be entirely wrong.

    Vertical integration allows leaders to see the full picture of their business — from customer to creative, hoof to tusk.

    Kristopher Tait

    Source link

  • WTF Fun Fact 13536 – AI and Rogue Waves

    For centuries, sailors have whispered tales of monstrous rogue waves capable of splitting ships and damaging oil rigs. These maritime myths turned real with the documented 26-meter-high rogue wave at the Draupner oil platform in 1995.

    Fast forward to 2023, and researchers at the University of Copenhagen and the University of Victoria have harnessed the power of artificial intelligence (AI) to predict these oceanic giants. They’ve developed a revolutionary formula using data from over a billion waves spanning 700 years, transforming maritime safety.

    Decoding Rogue Waves: A Data-Driven Approach

    The quest to understand rogue waves led researchers to explore vast ocean data. They focused on rogue waves, defined as waves at least twice the height of those around them, including extreme ones over 20 meters high. By analyzing data from buoys across the US and its territories, they amassed more than a billion wave records, equivalent to 700 years of ocean activity.

    Using machine learning, the researchers crafted an algorithm to identify rogue wave causes. They discovered that rogue waves occur more frequently than imagined, with about one monster wave daily at random ocean locations. However, not all are the colossal 20-meter giants feared by mariners.

    AI as a New-Age Oceanographer

    The study stands out for its use of AI, particularly symbolic regression. Unlike traditional AI methods that offer single predictions, this approach yields an equation. It’s akin to Kepler deciphering planetary movements from Tycho Brahe’s astronomical data, but with AI analyzing waves.

    The AI examined over a billion waves and formulated an equation, providing a “recipe” for rogue waves. This groundbreaking method offers a transparent algorithm, aligning with physics laws, and enhances human understanding beyond the typical AI black box.
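    The study’s own pipeline isn’t reproduced here, but symbolic regression itself is straightforward to demo: given samples of an unknown relationship, a genetic search over formula fragments returns a readable equation instead of a black-box prediction. A minimal sketch using the gplearn library on synthetic data (not wave data):

    ```python
    import numpy as np
    from gplearn.genetic import SymbolicRegressor

    # Synthetic data from a hidden formula, y = x0**2 + x1, standing in for wave data.
    rng = np.random.RandomState(0)
    X = rng.uniform(-1, 1, (200, 2))
    y = X[:, 0] ** 2 + X[:, 1]

    est = SymbolicRegressor(
        population_size=1000,
        generations=10,
        function_set=("add", "sub", "mul"),
        parsimony_coefficient=0.01,  # penalize needlessly long formulas
        random_state=0,
    )
    est.fit(X, y)

    # Unlike a typical black-box model, the output is an explicit expression.
    print(est._program)  # e.g. add(mul(X0, X0), X1)
    ```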

    Contrary to popular belief that rogue waves stem from energy-stealing wave combinations, this research points to “linear superposition” as the primary cause. Known since the 1700s, this phenomenon occurs when two wave systems intersect, amplifying each other momentarily.

    The study’s data supports this long-standing theory, offering a new perspective on rogue wave formation.
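    Linear superposition is simple enough to show directly: two wave trains with slightly different periods drift in and out of phase, and at the moment they align, the combined crest briefly reaches the sum of the two amplitudes. The toy numbers below are illustrative, not drawn from the study.

    ```python
    import numpy as np

    # Two 4-meter wave trains with slightly different periods.
    t = np.linspace(0, 600, 60_000)             # ten minutes, in seconds
    wave1 = 4.0 * np.cos(2 * np.pi * t / 10.0)  # 10-second period
    wave2 = 4.0 * np.cos(2 * np.pi * t / 9.5)   # 9.5-second period

    sea = wave1 + wave2  # linear superposition: the surfaces simply add

    # When the trains align in phase, the crest approaches 4 + 4 = 8 meters,
    # double either component, even though neither wave train changed.
    print(f"Max combined crest: {sea.max():.2f} m")
    ```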

    Towards Safer Maritime Journeys

    This AI-driven algorithm is a boon for the shipping industry, constantly navigating potential dangers at sea. With approximately 50,000 cargo ships sailing globally, this tool enables route planning that accounts for the risk of rogue waves. Shipping companies can now use the algorithm for risk assessment and choose safer routes accordingly.

    The research, algorithm, and utilized weather and wave data are publicly accessible. This openness allows entities like weather services and public authorities to calculate rogue wave probabilities easily. The study’s transparency in intermediate calculations sets it apart from typical AI models, enhancing our understanding of these oceanic phenomena.

    The University of Copenhagen’s groundbreaking research, blending AI with oceanography, marks a significant advancement in our understanding of rogue waves. By transforming a massive wave database into a clear, physics-aligned equation, this study not only demystifies a long-standing maritime mystery but also paves the way for safer sea travels. The algorithm’s potential to predict these maritime monsters will be a crucial tool for the global shipping industry, heralding a new era of informed and safer ocean navigation.

    Source: “AI finds formula on how to predict monster waves” — ScienceDaily

    Source link

  • WTF Fun Fact 13611 – Turning Data Into Music

    Scientists are turning data into music to see if it can help us understand large and intricate datasets in new and interesting ways.

    Tampere University and Eastern Washington University’s groundbreaking “data-to-music” algorithm research transforms intricate digital data into captivating sounds. And the researchers have presented a novel and potentially revolutionary approach to data comprehension.

    Sonic Data Interpretation

    At TAUCHI (the Tampere Unit for Computer-Human Interaction) in Finland and at Eastern Washington University in the USA, a research group spent half a decade exploring the merits of converting data into musical sounds. Their findings, from research funded by Business Finland, are encapsulated in a recent paper.

    Jonathan Middleton, DMA, the main contributor to the study, serves as a professor of music theory and composition at Eastern Washington University; he is also a visiting researcher at Tampere University. Under his guidance, the research centered on enhancing user engagement with intricate data variables using “data-to-music” algorithms. To exemplify their approach, the team utilized data extracted from Finnish meteorological records.

    Middleton emphasizes the transformative potential of their findings: “In today’s digital era, as data collection and deciphering become intertwined with our routine, introducing fresh avenues for data interpretation becomes crucial.” He champions the concept of a ‘fourth’ dimension in data interpretation, emphasizing the potential of musical characteristics.

    Turning Data Into Music

    Music is not just an art form; it captivates, entertains, and resonates with human emotions. It enhances the experience of films, video games, live performances, and more. Now, imagine the potential of harnessing music’s emotive power to make sense of complex data sets.

    Picture a basic linear graph displaying heart rate data. Now, amplify that visualization with a three-dimensional representation enriched with numbers, hues, and patterns. But the true marvel unfolds when a fourth dimension is introduced, where one can audibly engage with this data. Middleton’s quest revolves around identifying which mode or dimension maximizes understanding and interpretation of the data.
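    A bare-bones version of that fourth dimension can be built with Python’s standard library plus NumPy (this illustrates the general sonification idea, not the researchers’ algorithm): map each data point to a pitch and render the sequence as audio, so rises and falls in the series become rises and falls in melody.

    ```python
    import wave
    import numpy as np

    SAMPLE_RATE = 44_100

    def sonify(values, path="data_melody.wav", note_s=0.25):
        """Map each value to a pitch between 220 Hz and 880 Hz and write a WAV file."""
        lo, hi = min(values), max(values)
        chunks = []
        for v in values:
            freq = 220 + (v - lo) / (hi - lo) * (880 - 220)  # linear pitch mapping
            t = np.linspace(0, note_s, int(SAMPLE_RATE * note_s), endpoint=False)
            chunks.append(0.3 * np.sin(2 * np.pi * freq * t))
        audio = np.concatenate(chunks)
        with wave.open(path, "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)  # 16-bit samples
            f.setframerate(SAMPLE_RATE)
            f.writeframes((audio * 32767).astype(np.int16).tobytes())

    # Example: a resting-then-elevated heart rate trace becomes an audible rise and fall.
    sonify([62, 64, 63, 70, 82, 95, 110, 104, 90, 75])
    ```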

    For businesses and entities that anchor their strategies on data interpretation to tailor offerings, Middleton’s research presents profound implications. He believes their findings lay the groundwork for data analysts worldwide to tap into this fourth, audial dimension, enhancing understanding and decision-making.

    A Symphony of Data Possibilities

    As data continues to drive decision-making processes across industries, the quest for innovative interpretation techniques remains relentless. Tampere University and Eastern Washington University’s “data-to-music” research illuminates a path forward. With the potential to hear and emotionally connect with data, industries can achieve a deeper understanding, making data analysis not just a technical task but also an engaging sensory experience.

    Source: “Complex data becomes easier to interpret when transformed into music” — ScienceDaily

    Source link

  • Relying Solely on Your Gut to Make Big Business Decisions Could Cost Your Career | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    No matter your industry, Big Data and analytics are fundamentally changing how businesses operate and make decisions. Throw in a culture of rapid digital transformation, and the expectations for leadership roles are shifting at an unprecedented rate.

    It’s no longer sufficient for C-suite executives and senior leaders to simply excel in traditional management skills such as strategic vision and people management. Now, the currency of effective leadership also includes an intimate understanding of data analytics.

    Consider just a few of the ways that Big Data and analytics are driving decisions in the modern workplace:

    • Customer analysis to better understand customer needs, preferences and behaviors.
    • Predictive models to forecast future trends or performance.
    • Risk analysis to identify potential threats and opportunities.

    A recent survey underscores the power of data analytics: 44% of executives consider data crucial for strategic decision-making, while 37% believe it offers deeper insights into their business.

    As a C-suite executive, an ability to interpret data-driven insights from these types of analyses can create a competitive advantage for climbing the corporate ladder. But how should you get started with data analytics?

    Related: Five Ways Big Data Can Help Your Business Succeed

    The evolving landscape of modern leadership

    There was a time when the primary expectations for senior leaders were aspects like visionary thinking, strategic planning and people management. While these skills remain vital, the technological revolution has introduced a new dimension to leadership: data literacy.

    The arrival of “Big Data” and the learning models that drive it have made data literacy increasingly crucial for executives. Data literacy involves reading, analyzing and communicating insights from vast volumes of data. It requires understanding statistics and techniques like machine learning and natural language processing (NLP).

    As a senior leader, your ability to understand and use this data boosts team effectiveness and positions you as a forward-thinking executive. Consider KPIs — the backbone of performance management. By interpreting the data, you gain insights into team performance and areas for improvement.

    This enables you to make better decisions and have more informed conversations with your team members. Plus, understanding the basics of machine learning helps you identify opportunities for automation or optimization that may have been missed.

    Strategies for integrating data analytics into leadership

    If data literacy is a new skill that can boost your career, the question becomes: How can you cultivate this skill to help your resume rise up the ranks? For senior leaders interested in harnessing the power of data analytics for career growth, here are some strategies to consider:

    Develop a data-driven mindset

    Before diving into tools and techniques, developing a data-driven mindset is crucial. Start by asking data-based questions in meetings, challenging assumptions with empirical evidence, and encouraging your team to do the same. This helps to foster a culture of data-driven decision-making and sets the tone for deeper exploration later on.

    Proactively seek out opportunities to learn

    Data analytics is a broad field — from basic spreadsheet software to sophisticated machine learning algorithms. Identify which skills you need to learn to use data effectively, then look for sources such as online courses or internal training to build those competencies.

    Collaborate with data experts

    Don’t isolate yourself; instead, make it a point to collaborate with your organization’s data scientists (if you have any), analysts or other data professionals. They can provide insights that are not immediately apparent and guide you through the complexities of data interpretation.

    Plus, by adding a roster of skilled data advisors to your network, you can benefit from their expertise and experience.

    Implement data-driven projects in your current role

    Once you’re comfortable with the basics, initiate a data-driven project within your team or department.

    It could be anything from improving customer experience based on feedback data to optimizing supply chain logistics. Such projects provide practical experience and showcase your leadership in adapting to the new data-centric business environment.

    Track and showcase your success

    Nothing speaks louder than results. As you implement data-driven initiatives, track the outcomes meticulously. Be prepared to showcase these successes in performance reviews or when seeking a promotion, as they make a compelling case for your leadership capabilities in a data-driven era.

    Related: The Pivotal Role Of Big Data In E-Commerce

    Gain a competitive edge through data analytics

    If you’re going to compete, data analytics is no longer a luxury — it’s a necessity for senior leaders aspiring to career advancement. Mastering this skill set enhances your decision-making and differentiates you in the eyes of stakeholders and hiring committees.

    To stay ahead of the pack, you need to understand and interpret data trends proficiently. The more comfortable and confident you are with data-driven insights, the more likely you can capitalize on opportunities before others do.

    • Lead with data, not just instincts: Enrich your leadership instincts with empirical data for a more balanced and credible decision-making approach.
    • Collaborate with data experts: Build a network of data professionals within your organization to enhance your data literacy and garner insights.
    • Implement data-driven projects: Showcase your newfound skills by leading a data-centric project within your team or department.
    • Track and showcase success: Measure the outcomes of your data-driven initiatives and be prepared to present them in performance reviews or job interviews.
    • Make data analytics a leadership trait: Adopt data literacy as a core leadership trait, on par with qualities like strategic thinking and empathy.

    Start today by learning the basics of data analytics and how to use them in your decision-making process. As you grow in understanding and skill, never forget to show the value of data-driven initiatives in your organization. Doing so will help you become the kind of influential leader the world needs today.

    Tim Madden

    Source link

  • How Smart Technologies Are Revolutionizing Supply Chain Management | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    Supply chain management plays a pivotal role in the success of any enterprise. Entrepreneurs and business owners are constantly seeking innovative ways to optimize their supply chains, reduce operational costs and enhance overall efficiency.

    This is where the Internet of Things (IoT) and smart technologies step in to revolutionize the field of supply chain management.

    Related: Supply Chain Management: The Game-Changing Innovations That Are Shaping the Industry

    The role of IoT in supply chain management

    IoT refers to the interconnected network of physical devices, vehicles, buildings and other objects embedded with sensors, software and network connectivity that enables them to collect and exchange data. When integrated into supply chain operations, IoT can bring about a transformational shift in the way businesses manage their logistics and distribution processes.

    1. Real-time tracking and visibility:

    In the context of supply chain management, real-time tracking and visibility are game-changers. IoT devices, such as GPS sensors and RFID tags, provide continuous data streams that allow entrepreneurs to monitor their goods at every stage of the supply chain journey. This means they can pinpoint the exact location of products, monitor their condition and track their movement from manufacturer to distributor to retailer.

    The benefits are twofold. First, this level of visibility significantly reduces the risk of theft and losses since any anomalies or deviations from the planned route can trigger immediate alerts. Second, it offers valuable insights into the overall efficiency of the supply chain. By analyzing data on delivery times, transportation routes and storage conditions, entrepreneurs can identify areas for improvement, optimize routes and ensure that goods reach their destination faster and in better condition.
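
    As a rough illustration of how such an alert might work, here is a minimal Python sketch; the waypoints, coordinates and 5 km tolerance are invented for illustration, and real systems would use a proper geofencing service:

        # Toy route-deviation check: alert when a GPS fix is far from every
        # planned waypoint. All values below are made up for illustration.
        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two points, in kilometers."""
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 6371 * asin(sqrt(a))

        PLANNED_ROUTE = [(40.7128, -74.0060), (40.2206, -74.7597), (39.9526, -75.1652)]

        def off_route(lat, lon, tolerance_km=5.0):
            """True if the fix is farther than the tolerance from every waypoint."""
            return all(haversine_km(lat, lon, wlat, wlon) > tolerance_km
                       for wlat, wlon in PLANNED_ROUTE)

        print(off_route(40.7130, -74.0055))  # False: near the first waypoint
        print(off_route(41.5000, -72.0000))  # True: far off the planned route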

    2. Inventory management:

    IoT sensors are capable of automating inventory management with unprecedented accuracy and efficiency. These sensors can monitor inventory levels in real time and send automatic alerts when stock is running low or when products are approaching their expiration date.

    This proactive approach to inventory management has numerous advantages. It prevents stockouts, ensuring that businesses never run out of essential supplies, which can be especially critical for just-in-time manufacturing processes. It also helps in reducing overstock situations, which can tie up capital and storage space. Ultimately, this level of control not only optimizes storage space but also improves cash flow management by reducing excess inventory costs.
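
    To make that concrete, here is a minimal sketch of the kind of threshold check such a system might run; the SKU format, reorder point and expiry window are illustrative assumptions, not any particular vendor’s API:

        # Toy inventory check: flag low stock and approaching expiry dates.
        from datetime import date, timedelta

        REORDER_POINT = 50                   # alert below this many units
        EXPIRY_WINDOW = timedelta(days=14)   # alert when expiry is this close

        def check_inventory(sku, quantity, expiry):
            """Return alert messages for one SKU from sensor-reported data."""
            alerts = []
            if quantity < REORDER_POINT:
                alerts.append(f"{sku}: stock low ({quantity} units) - reorder")
            if expiry - date.today() <= EXPIRY_WINDOW:
                alerts.append(f"{sku}: expires {expiry} - rotate or discount")
            return alerts

        print(check_inventory("SKU-1042", 38, date.today() + timedelta(days=10)))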

    3. Predictive maintenance:

    Within the IoT ecosystem, smart technologies can predict when machinery and equipment are likely to fail. IoT sensors on machines can continuously monitor their performance, collecting data on factors such as temperature, vibration and energy consumption. By analyzing this data, predictive maintenance algorithms can identify patterns that indicate when a machine is deviating from its normal operating conditions, suggesting a potential breakdown.

    This predictive capability is a game-changer for supply chain operations. Instead of relying on scheduled maintenance, which can be costly and lead to unnecessary downtime, businesses can address maintenance needs proactively. This minimizes downtime, reduces repair costs and ensures smooth operations. In essence, it keeps the supply chain running like a well-oiled machine.
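
    As a toy sketch of the underlying idea (flagging readings that drift too far from a historical baseline), here is a simple z-score rule in Python; the vibration data and the three-sigma threshold are invented, and production systems use far richer models:

        # Flag sensor readings that deviate sharply from the machine's baseline.
        from statistics import mean, stdev

        baseline = [0.42, 0.45, 0.44, 0.43, 0.46, 0.44, 0.45, 0.43]  # mm/s, normal
        mu, sigma = mean(baseline), stdev(baseline)

        def is_anomalous(reading, threshold=3.0):
            """True when the reading is over `threshold` std devs from baseline."""
            return abs(reading - mu) / sigma > threshold

        for reading in [0.44, 0.47, 0.61]:
            print(reading, "anomalous" if is_anomalous(reading) else "normal")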

    4. Reduced costs:

    IoT-enabled supply chains are inherently more efficient. The real-time data provided by IoT devices allows businesses to identify bottlenecks and inefficiencies quickly. For example, if goods are consistently delayed at a particular warehouse or if delivery routes are suboptimal, these issues can be promptly addressed.

    By optimizing processes and streamlining operations, businesses can significantly reduce costs in various aspects of the supply chain, including transportation, warehousing and labor. For instance, they can minimize fuel consumption by optimizing delivery routes, reduce warehousing costs by better managing inventory levels and enhance labor productivity by automating routine tasks. This cost reduction not only improves profitability but also enables businesses to remain competitive in a rapidly changing market.

    Related: IoT: Introduction And Disruption Of Supply Chain Management

    The power of data analytics

    IoT generates an immense amount of data, but its true potential is unlocked through data analytics. Entrepreneurs can harness this data to gain valuable insights into consumer behavior, demand patterns and supply chain performance. By leveraging advanced analytics tools and machine learning algorithms, businesses can make data-driven decisions that enhance their competitiveness.

    Smart technologies beyond IoT

    In addition to IoT, several other smart technologies are making waves in supply chain management:

    1. Blockchain:

    Blockchain technology is revolutionizing supply chain management by offering secure and transparent tracking of products and transactions throughout the entire supply chain journey. Here’s how it works:

    • Secure and immutable records: Every transaction or movement of products is recorded in a secure and immutable blockchain ledger. This means that once data is entered, it cannot be altered or tampered with. This inherent security ensures the authenticity of records, reducing the risk of fraudulent or deceptive practices.

    • End-to-end transparency: Blockchain provides an unbroken, transparent chain of custody for products. Entrepreneurs can trace the origin of each product, monitor its movement from manufacturer to distributor to retailer and even verify its authenticity. This level of transparency not only reduces the risk of counterfeit goods but also enhances trust among consumers.

    • Smart contracts: Blockchain allows for the implementation of smart contracts, which are self-executing agreements with predefined rules. These contracts can automate various supply chain processes, such as payments, quality inspections and compliance checks. This automation reduces administrative overhead and ensures that contractual obligations are met promptly.

    2. Artificial Intelligence (AI):

    AI-driven algorithms are a powerful tool for optimizing supply chain processes. Here’s how AI can transform supply chain management:

    • Demand prediction: AI algorithms can analyze historical data, market trends and various external factors to predict demand accurately. This enables businesses to adjust their production and inventory levels accordingly, reducing the risk of overstocking or stockouts.

    • Process automation: AI can automate routine and repetitive tasks, such as data entry, order processing and inventory management. This not only reduces labor costs but also minimizes the potential for human errors, improving overall efficiency.

    • Enhanced decision-making: AI can analyze vast amounts of data in real time to make informed decisions. For instance, it can optimize delivery routes based on real-time traffic data or recommend the most cost-effective suppliers. This data-driven decision-making leads to more efficient supply chain operations.

    • Personalized customer service: AI-powered chatbots and customer service platforms can personalize recommendations and resolve customer issues more efficiently. This enhances the customer experience and fosters brand loyalty.

    3. Robotic Process Automation (RPA):

    Robotic Process Automation involves the use of robots and automation technologies to streamline various aspects of supply chain management. Here’s how RPA is making a significant impact:

    • Warehouse operations: Robots can automate tasks within warehouses, such as picking and packing products. They work with precision and consistency, reducing the potential for errors and increasing order accuracy. This not only speeds up order fulfillment but also reduces labor costs.

    • Repetitive task automation: RPA can handle repetitive and rule-based tasks, such as data entry, invoice processing and shipment tracking. By automating these tasks, businesses can free up human resources for more strategic activities.

    • Enhanced efficiency: RPA can operate around the clock, ensuring that supply chain operations continue without interruptions. This enhances overall efficiency and reduces lead times.

    • Cost reduction: By automating routine tasks, RPA reduces labor costs and the potential for errors that can lead to additional expenses. It also optimizes resource utilization, ensuring that operations are cost-effective.

    Related: How AI Can Revolutionize Our Broken Supply Chain

    Taiwo Sotikare

    Source link

  • TikTok hit with €345M fine for violating children’s privacy

    Booming social media application TikTok needs to pay up in Europe for violating children’s privacy.

    The popular Chinese-owned app failed to protect children’s personal information by making their accounts publicly accessible by default and insufficiently tackled risks that under-13 users could access its platform, the Irish Data Protection Commission (DPC) said in a decision published Friday.

    The regulator slapped TikTok with a €345 million fine for breaching the EU’s landmark privacy law, the General Data Protection Regulation (GDPR).

    The penalty comes amid high tensions between the European Union and China, following the EU’s announcement that it plans to probe Chinese state subsidies of electric cars. European Commission Vice President Věra Jourová is also set to visit China next Monday-Tuesday and meet Vice Premier Zhang Guoqing to discuss the two sides’ technology policies, amid growing concerns over Beijing’s data gathering and cyber espionage practices.

    “Alone the fine of [€345 million] is a headline sanction to impose but reflects the extent to which the DPC identified child users were exposed to risk in particular arising from TikTok’s decision at the time to default child user accounts to public settings on registration,” said Helen Dixon, the Irish data protection commissioner, in a written statement.

    The Irish privacy regulator said that, in the period from July to December 2020, TikTok had unlawfully made accounts of users aged 13 to 17 public by default, effectively making it possible for anyone to watch and comment on videos they posted. The company also did not appropriately assess the risks that users under the age of 13 could gain access to its platform. It also found that TikTok is still pushing teenagers joining the platform to make their accounts and videos public through manipulative pop-ups. The regulator ordered the firm to change these misleading designs, known as dark patterns, within the next three months.

    Minors’ accounts could be paired up with unverified adult accounts during the second half of 2020. The authority said the video platform had also previously failed to explain to teenagers the consequences of making their content and accounts public.

    “We respectfully disagree with the decision, particularly the level of the fine imposed,” said Morgan Evans, a TikTok spokesperson. “The [Data Protection Commission]’s criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under-16 accounts to private by default.”

    TikTok added it will comply with the order to change misleading designs by extending such default-privacy settings to accounts of new users aged 16 and 17 later in September. It will also roll out in the next three months changes to the pop-up young users get when they first post a video.

    The decision marks the largest-ever privacy fine for TikTok, which is now actively used by 134 million Europeans monthly, and the fifth-largest fine imposed on any tech company under the GDPR.

    The platform popular among teenagers has previously faced criticism for insufficiently mitigating harms it poses to its young users, including deadly viral challenges and its addictive algorithm. TikTok — like 18 other online platforms — also now has to limit risks like cyberbullying or face steep fines under the Digital Services Act (DSA).

    The costly fine adds to TikTok’s woes in Europe, after it saw a wave of new restrictions on its use earlier this year due to concerns about its connection to China.

    The social media app, whose parent company ByteDance is based in Beijing, has struggled to quash concerns over its data security. The company said this month it had started moving its European data to a center within the bloc. Yet, it is still under investigation by the Irish Data Protection Commission over the potentially unlawful transfer of European users’ data to China.

    The Irish data authority in 2021 started probing whether TikTok was respecting children’s privacy requirements. TikTok set up its legal EU headquarters in Dublin in late 2020, meaning the Irish privacy watchdog has been the company’s supervisor for the whole bloc under the GDPR.

    Other national watchdogs weighed in on the investigation over the summer via the European Data Protection Board (EDPB), after two German privacy agencies and Italy’s regulator disagreed with Ireland’s initial findings. The group instructed Ireland to sanction TikTok for nudging its users toward public accounts in its misleading pop-ups.

    The board of European regulators also had “serious doubts” that TikTok’s measures to keep under-13 users off its platform were effective in the second half of 2020. The EDPB said the mechanisms “could be easily circumvented” and that TikTok was not checking ages “in a sufficiently systematic manner” for existing users. The group said, however, that it couldn’t find an infringement because of a lack of information available during their cooperation process.

    The United Kingdom’s data regulator in April fined TikTok £12.7 million (€14.8 million) for letting children under 13 on its platform and using their data. The company also received a €750,000 fine in 2021 from the Dutch privacy authority for failing to protect Dutch children by not having a privacy policy in their native language.

    Clothilde Goujard

    Source link

  • 7 Metrics to Evaluate the Success of Your Marketing Campaigns | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    The ability to quantify the effectiveness of marketing campaigns and strategies is no longer a luxury; it’s a strategic imperative that separates thriving businesses from those merely treading water. This article highlights the significance of measuring marketing ROI and explores key performance indicators (KPIs) that can help steer your marketing efforts toward tangible success.

    The importance of measuring marketing ROI

    Defining marketing ROI involves determining the profitability of an investment in marketing by comparing the gained revenue against the incurred costs. This calculation is central to understanding the impact of marketing campaigns on the bottom line. By evaluating ROI, businesses gain insights into which marketing efforts are delivering the most significant returns and can allocate resources accordingly.
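
    In code, the calculation itself is a one-liner; the figures below are invented for illustration:

        # Marketing ROI = (revenue gained - cost) / cost, as a percentage.
        def marketing_roi(attributed_revenue, campaign_cost):
            return (attributed_revenue - campaign_cost) / campaign_cost * 100

        print(f"{marketing_roi(60_000, 20_000):.0f}% ROI")  # 200% ROI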

    Measuring ROI is particularly crucial for marketing agencies and their clients. In an era driven by data, both parties benefit from the ability to make decisions grounded in evidence. A data-driven approach allows marketing agencies to fine-tune their strategies and tailor them to specific audiences, ultimately leading to more effective campaigns.

    According to a McKinsey survey, companies that base their decisions on data and analytics are 23 times more likely to acquire customers, six times more likely to retain them and 19 times more likely to be profitable.

    For clients, it ensures that their investments generate tangible results, fostering a sense of trust and satisfaction in the agency’s work.

    Related: How to Gauge Marketing Success in a Shifting Business Landscape

    Challenges in measuring marketing ROI

    While the benefits of measuring marketing ROI are substantial, challenges often arise in the measurement process. Tracking the diverse touchpoints of modern marketing campaigns, accurately attributing conversions to specific channels and accounting for indirect impacts can be intricate tasks.

    Another issue that may arise is that different businesses and industries have varying sales cycles and customer journeys. This complicates the establishment of a standardized ROI measurement methodology.

    Addressing these challenges requires a combination of strategy and technology. Marketing agencies must adopt data integration techniques that consolidate information from various platforms to form a comprehensive view of customer interactions.

    7 key performance indicators (KPIs) for marketing success

    As established above, effective marketing is about more than creative campaigns; it’s about making informed decisions based on quantifiable metrics. Key performance indicators (KPIs) serve as beacons in the vast sea of marketing data. This section explores seven crucial KPIs for marketing success.

    1. Website traffic and user engagement metrics

    In the digital realm, a brand’s online presence matters more than ever. Website traffic is a foundational KPI, encompassing metrics such as page views, unique visitors and bounce rate.

    Beyond mere numbers, these metrics signify the extent of a campaign’s reach. But traffic alone isn’t enough; user engagement metrics like time on page and click-through rate (CTR) offer a deeper perspective. These KPIs reveal not only the quantity but the quality of interactions, allowing businesses to refine content strategies and enhance user experiences.

    2. Conversion rate and goal completions

    The ultimate goal of marketing is to convert potential customers into active ones. The conversion rate, a pivotal KPI, measures the percentage of visitors who take a desired action — a purchase, sign-up or download. Across industries, the average conversion rate for landing pages is around 2.35%, but the top 25% of performers achieve rates of 5.31% or higher. For optimal results, aim for the top 10%, whose pages boast conversion rates of 11.45% or more.

    Paired with goal completions, which signal the successful attainment of predetermined objectives, these KPIs provide a holistic view of marketing effectiveness. They illuminate the alignment between strategies and outcomes, ensuring that campaigns resonate with target audiences and contribute to business objectives.

    3. Customer acquisition cost (CAC)

    Understanding the cost of acquiring a new customer is pivotal. Customer acquisition cost (CAC) quantifies the investment required for each new customer. A study by Invesp found that acquiring a new customer can cost five times more than retaining an existing one.

    This KPI holds the key to evaluating the efficiency of marketing spending. Lowering CAC directly enhances return on investment (ROI) — a reduction in acquisition expenses translates to higher profitability. Strategies for optimizing CAC include refining targeting methods, improving conversion rates and nurturing leads more effectively.

    Related: What Is Good Data-Driven Marketing? Here Are 5 Examples of What Big Data Can Do.

    4. Customer lifetime value (CLV)

    Customer lifetime value (CLV) is a transformative KPI that gauges the potential value a customer brings throughout their engagement journey. Research suggests that companies with the strongest omnichannel customer engagement strategies retain an average of 89% of their customers. In essence, CLV is closely intertwined with omnichannel strategies in marketing.

    Effectively utilizing multiple channels to engage customers throughout their journey substantially contributes to long-term customer relationships. In this context, CLV becomes a vital metric that measures the potential value of a customer across these various engagement touchpoints.

    5. Return on advertising spend (ROAS)

    Return on advertising spend (ROAS) helps evaluate the effectiveness of advertising campaigns by comparing generated revenue to advertising expenditure. A high ROAS signifies optimal budget allocation and campaign efficiency. Conversely, a low ROAS prompts a reevaluation of advertising strategies, ensuring resources are channeled into campaigns that deliver substantial returns.
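
    To illustrate the arithmetic behind three of these KPIs, here is a short sketch with invented figures for conversion rate, CAC and ROAS:

        # Worked KPI arithmetic with made-up campaign numbers.
        visitors, conversions = 40_000, 1_200
        marketing_spend, new_customers = 18_000, 300
        ad_spend, ad_revenue = 10_000, 42_000

        conversion_rate = conversions / visitors * 100  # KPI 2
        cac = marketing_spend / new_customers           # KPI 3
        roas = ad_revenue / ad_spend                    # KPI 5

        print(f"Conversion rate: {conversion_rate:.2f}%")  # 3.00%
        print(f"CAC: ${cac:.2f}")                          # $60.00
        print(f"ROAS: {roas:.1f}x")                        # 4.2x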

    6. Social media engagement and influence

    Engagement signifies the degree of user interaction with a brand’s content, measured by metrics like likes, comments, shares and clicks. It reflects your content’s resonance and the sense of community it fosters. On the other hand, influence goes beyond interaction, gauging a brand’s capacity to shape opinions and sway decisions, often propelled by collaborations with influencers. Combining these two can nurture customer loyalty and extend your brand’s impact beyond its immediate audience.

    7. Email marketing performance

    Email marketing remains an indispensable facet of digital communication, with compelling statistics underscoring its significance. Average open rates across industries hover around 38.49%, while click-through rates stand at approximately 2.91%, indicating the potency of well-crafted email campaigns to capture recipients’ attention and drive engagement.

    Effective email marketing strategies encompass personalized content, compelling subject lines and valuable offers, harnessing their potential to foster customer retention, lead nurturing and revenue growth.

    Data analytics and measurement tools

    Data analytics plays a pivotal role in capturing, interpreting and deriving insights from marketing data. Analytics empowers businesses to make informed decisions based on evidence rather than assumptions. This shift towards data-driven decision-making enhances marketing strategies by aligning them with customer preferences and behavior.

    Related: The Most Important Marketing Metric You’re Not Measuring

    Popular measurement tools for marketing ROI

    Several tools have gained popularity for their effectiveness in measuring marketing ROI. For example, Google Analytics offers comprehensive insights into website traffic, user behavior and conversion rates. Google Tag Manager simplifies the tracking and implementation of analytics tags. SEMrush aids competitive analysis, keyword research and SEO optimization. Hyros stands out for its advanced attribution modeling capabilities, offering a holistic view of customer journeys. Google Data Studio facilitates visualizing data and creating dynamic reports. These tools empower marketers to decipher performance, optimize strategies and enhance ROI by making informed data-driven decisions.

    In a landscape where marketing strategies can make or break a business, measuring ROI has emerged as an indispensable practice. The discussed KPIs provide a comprehensive framework for assessing marketing success and guiding decision-making. As marketing agencies and businesses continue to navigate the dynamic marketing ecosystem, embracing data-driven methodologies and measurement tools will be instrumental in achieving sustainable growth.

    Alex Quin

    Source link

  • US watchdog teases crackdown on data brokers that sell Americans’ personal information | CNN Business

    Washington (CNN) — The US government plans to rein in the vast data broker industry with new, privacy-focused regulations that aim to safeguard millions of Americans’ personal information from data breaches, violent criminals and even artificial intelligence chatbots.

    The coming proposal by the Consumer Financial Protection Bureau would extend existing regulations that govern credit reports, arrest records and other data to what the agency describes as the “surveillance industry,” or the sprawling economy of businesses that traffic in increasingly digitized personal information.

    The potential rules, which are not yet public or final, could bar data brokers from selling certain types of consumer information — including a person’s income or their criminal and payment history — except in specific circumstances, the CFPB said.

    The push could also see new restrictions on the sale of personal information such as Social Security numbers, names and addresses, which the CFPB said data brokers often buy from the major credit reporting bureaus to create their own profiles on individual consumers.

    Issued under the Fair Credit Reporting Act, the regulations would seek to ensure that data brokers selling that sensitive information do so only for valid financial purposes such as employment background checks or credit decisions, and not for unrelated purposes that may allow third parties to use the data to, for example, train AI algorithms or chatbots, the CFPB said.

    The announcement follows an agency study into the data broker industry this year that found widespread concerns about how consumer data is being collected, used and shared. The inquiry received numerous submissions from the public warning about the disproportionate risks that unregulated data sharing can have on minorities, seniors, immigrants and victims of domestic violence.

    “Reports about monetization of sensitive information — everything from the financial details of members of the U.S. military to lists of specific people experiencing dementia — are particularly worrisome when data is powering ‘artificial intelligence’ and other automated decision-making about our lives,” CFPB Director Rohit Chopra said in a statement. “The CFPB will be taking steps to ensure that modern-day data brokers in the surveillance industry know that they cannot engage in illegal collection and sharing of our data.”

    The CFPB’s proposal will first be floated with a group of small businesses for feedback before being publicly unveiled in a formal rulemaking, the agency said.

    The CFPB isn’t the only US agency clamping down on the massive data industry. Last year, the Federal Trade Commission proposed a sweeping set of regulations that may restrict how all businesses collect and use consumer data, taking aim at what FTC Chair Lina Khan has described as the “persistent tracking and routinized surveillance of individuals.”

    The agency initiatives reflect how Congress has continually failed to produce a comprehensive, national-level consumer privacy law, despite years of lawmaker negotiations and the rise of privacy regulations overseas that increasingly affect US businesses.

    Source link

  • Senior Executives Are Falling Behind The Digital Curve — Here’s What It Takes to Stay Ahead. | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    As digitalization continues to shape the modern business landscape, senior executives are now more than ever required to stay current and relevant. A Deloitte survey found that 67% of executives felt “uncomfortable” accessing or using data from advanced analytic systems.

    For executives to stay ahead of the curve, a solid grounding in digital literacy is essential. This includes being knowledgeable about new technologies and their potential challenges and risks. It also means staying up to date on the latest trends in software, hardware and digital tools that can help increase efficiency, productivity and communication.

    So what major areas of digital literacy should senior executives focus on to stay current and relevant?

    Related: 11 Leadership Guidelines For The Digital Age

    Accepting AI and machine learning

    If the time it took for companies to adopt the internet felt short, the adoption of artificial intelligence (AI) has felt even shorter.

    AI and machine learning aren’t new technologies, but their practical uses are evolving quickly. Senior executives should be familiar with AI and its capabilities to understand how it can help improve the customer experience and overall business operations.

    Some 94% of corporate leaders surveyed by Deloitte believe that AI will profoundly affect their businesses within the next three years. Learning what AI offers in data analytics and workflow automation can help senior executives stay relevant in the digital era.

    Embracing big data analytics

    Analytics has always been a powerful tool for businesses, but the rise of big data has made them even more valuable. With big data comes an increased potential to gain insights into customer behavior and preferences that can help inform better decisions and strategies. A NewVantage Partners survey found that 97% of senior executives are boosting their investments in data initiatives.

    Senior executives should be familiar with the fundamentals of big data analytics and understand how they can use big data to their advantage. They should be able to leverage analytics to improve decision-making, identify market trends, and uncover customer needs more efficiently.

    Implementing agile methodology

    The digital landscape is constantly changing, and senior executives need to stay agile in order to keep up with the latest trends.

    Agile methodologies — those that focus on rapid delivery and iterative development cycles — are becoming increasingly common in the enterprise. Senior executives should be familiar with agile principles and be able to apply them to their own organizations.

    Agile-focused leaders are often better equipped to handle change and adapt quickly to disruptive market trends. They can also provide better guidance for their teams, enabling them to move faster and successfully transition into the digital era.

    Investing in cybersecurity and privacy

    As digital transformation takes hold, the need for strong security protocols and privacy practices becomes even more critical. A KPMG survey found that 39% of senior executives see their company as unprepared to handle cyber attacks. Executives need to understand the basics of cybersecurity and ensure their organization complies with applicable regulations.

    It’s also critical to understand the importance of protecting customer data and how companies can ensure they’re providing a secure user experience. This includes understanding the latest security trends and best practices and how to respond quickly in case of a cyber attack.

    Related: The Emerging Cybersecurity Trends In 2023

    Upskilling and reskilling the workforce

    Enhancing one’s digital literacy goes beyond learning for yourself — it should also include staying up-to-date on the skills necessary to successfully train and manage a digital workforce. No matter where your teams work, they want to know that they are being equipped and empowered to perform their jobs effectively.

    Senior executives should strive to learn about the latest upskilling and reskilling trends in digital fields, such as AI, machine learning, and automation. They should also understand how these technologies can improve overall business operations — from boosting employee productivity and efficiency to unlocking new insights into customer behavior.

    Supercharging innovation across the organization

    Technology can be an excellent tool for driving innovation and creativity, but senior executives must understand how to leverage it effectively. This means having a clear vision of where the organization is going and using digital tools to help achieve that goal.

    Creativity should be encouraged across departments, and senior executives should be able to provide guidance and resources that help teams develop innovative solutions.

    Engaging in the digital marketplace

    Business is moving from the brick-and-mortar world to the digital environment, and executives need to understand how to engage effectively in this new marketplace. Investing time and resources into how a company can engage online and utilizing digital channels to reach customers is essential for success in the digital world.

    Senior executives should grasp the evolving e-commerce landscape well and be familiar with best practices such as leveraging customer reviews, providing targeted promotions, and using data-driven insights to create personalized experiences.

    For instance, understanding how to use social media platforms and other digital tools to build customer relationships can help create a more loyal base of followers.

    The bottom line

    The world is changing, and leaders need to understand the importance of staying ahead of the digital curve.

    As senior executives, it’s essential to recognize that embracing digital transformation is no longer an optional strategy — it’s a requirement for success in today’s economy. Fortunately, the resources and knowledge are available to help you get up to speed quickly, so start exploring and learning today.

    Your teams — and your customers — will thank you.

    Tim Madden

    Source link

  • Want to Build Trust? Focus on Data Privacy | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    Did you know that 422.14 million people were impacted by data compromises in 2022 in the US alone? With increasing instances of data breaches and unauthorized data use, it’s no wonder that data privacy has become the top priority for customers. A Cisco report states that 76% of people say they would not buy from a company they do not trust with their data.

    As a result, you must apply comprehensive data safety measures to retain your customers’ trust in your company. This is especially crucial in marketing because customer data collection, storage and analysis drive modern marketing.

    If you are wondering how to safeguard customer data in your marketing and bolster customers’ trust in your company, you’re in the right place. I’ve put together this guide to explain data privacy in marketing in simple terms and to help you devise cost-effective strategies for complying with customer data regulations.

    Related: 8 Ways a Data Breach Could Take Out Your Company Tomorrow

    Understanding data privacy in marketing

    Data privacy in marketing refers to protecting and responsibly handling consumer information collected throughout your marketing endeavors. Why is this crucial?

    Marketers engage in a plethora of data collection and handling activities. This includes personal and behavioral data that help them gain insights into their target audience and provide personalized experiences. However, the threat of data privacy breaches is growing daily, and a breach can have severe consequences.

    These include reputational damage, legal repercussions, financial loss and loss of customer trust. So, every marketer must prioritize customer data security and comply with privacy regulations. Let’s understand how this works.

    Related: Schools Are Getting Slammed By Cyberattacks and Student Data Is No Longer Safe. Here’s How to Navigate Cybersecurity in the New, Digital Classroom

    Ensuring customer data protection

    First, ensure your website is secure and that consumer data is used only for legitimate purposes. After all, the Harvard Business Review found that 84% of consumers avoid shopping from brands with suspicious websites. Beyond that, here are the critical methods for safeguarding your consumers’ privacy and trust.

    1. Compliance with data protection regulations

    The most important step is to ensure compliance with data protection regulations, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States. These two acts are the primary customer data regulation frameworks, and if not followed, they lead to substantial fines and reputational damage.

    The CCPA, which took effect in 2020, grants consumers the right to know what personal data is collected and to request the deletion of their information. It requires organizations to provide, upon a customer’s request, a list of the third parties that have access to the data. Enterprises that fail to comply face statutory damages ranging from $100 to $750 per consumer per incident.

    The GDPR took effect in 2018 and grants individuals control over their data. To comply, businesses must obtain informed consent for data collection, provide privacy policies and notices, and implement measures to protect the data. Noncompliance can result in a fine of €20 million or 4% of global annual turnover, whichever is higher.

    By incorporating stringent measures under these laws, you fulfill the legal requirements, instill confidence in consumers about how their information is handled and strengthen your brand’s reputation.

    2. Implementing Multi-factor Authentication (MFA)

    Multi-factor authentication adds a layer of security to customer data, and it is growing in use thanks to its effectiveness and ease of use. Under MFA, users must provide multiple forms of verification, such as a password plus a one-time code sent to their phone or email.

    As a marketer, you can implement MFA across customer portals, employee access and administrative dashboards. You can also go a step further and implement advanced methods such as biometric verification to strengthen security.

    With MFA, you can significantly decrease the risk of a data breach by allowing only authorized individuals access to sensitive customer data. This will naturally bolster consumer trust regarding your data handling practices.
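
    For illustration, here is a minimal sketch of the TOTP flow behind many authenticator apps, using the third-party pyotp library (pip install pyotp). It shows only code generation and verification; a production rollout would add secure secret storage, enrollment, rate limiting and fallback factors:

        import pyotp

        # Generated once per user at enrollment and stored securely server-side.
        secret = pyotp.random_base32()
        totp = pyotp.TOTP(secret)

        # URI the user scans into an authenticator app (names are placeholders).
        print(totp.provisioning_uri(name="user@example.com", issuer_name="ExampleCo"))

        code = totp.now()         # what the user's app would display
        print(totp.verify(code))  # True within the 30-second window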

    Related: Safeguarding Your Corporate Environment from Social Engineering

    3. Implementing Decentralized Finance (DeFi)

    DeFi is an innovative technology that uses blockchain to create a decentralized ecosystem for secure financial data management. A recent survey by Antier Solutions found that about 15-20% of small businesses already use DeFi services for financing. This suggests DeFi is living up to its promise of safety and reliability. But what makes it so effective?

    DeFi platforms use distributed ledger technology and cryptography to decentralize data storage and reduce single points of risk. Moreover, DeFi uses smart contracts to ensure transparency between parties. Smart contracts are self-executing agreements, stored on the blockchain, that automatically enforce predefined conditions. You can use smart contracts to obtain consumer consent for data collection and usage.

    In contrast, traditional centralized finance relies on data storage systems with inherent vulnerabilities: a weakness at even a single point can allow hackers to extract data.

    4. Prioritize Supplier and Vendor Security

    Marketers who collaborate with third-party suppliers and vendors may grant them access to consumer data. If you are among them, conduct due diligence when selecting partners. Furthermore, your contracts should include provisions requiring third parties to comply with the same data protection and privacy standards you hold yourself to.

    5. Establish Incident Response Plans

    Despite all precautions, there is always a risk of things going wrong as technologies develop rapidly. A comprehensive incident response plan is vital because it helps you address and mitigate the impact of any data breach or privacy issue. Here’s what you must do to establish an effective incident response plan:

    • Start by establishing a cross-functional incident response team. This must comprise individuals from different departments, such as IT, communication, legal, etc., to bring together a range of expertise.
    • You must have an escalation plan ready to ensure incidents are promptly forwarded to the appropriate management level.
    • Develop clear communication protocols for both the internal stakeholders and external parties, such as the affected customers and regulatory authorities.
    • Designate official spokespersons for prompt and transparent communication because it helps maintain trust and demonstrates a commitment to addressing the incident responsibly.

    Lastly, don’t wait for emergencies! Conduct regular tabletop exercises and simulations to test the effectiveness of your plans and identify areas for improvement.

    Summing up

    Data privacy has become the most fundamental aspect of maintaining customer trust and building strong relationships in the era of data-driven digital marketing. Its importance cannot be overstated in a world where data breaches are rising in frequency and customers are increasingly sensitive about their data safety.

    So, remember to analyze and implement the points in this guide carefully and always stay up-to-date with the latest privacy regulations, data security threats, and customer expectations. The area of data safety is constantly evolving, and only the most agile and vigilant marketers will find lasting success.

    Vikas Agrawal

    Source link

  • Statistical Significance: Here Are Some Examples, Types and More | Entrepreneur

    Statistical significance is a critical concept in data analysis and research. In essence, it’s a measure that allows researchers to assess whether the results of an experiment or study are due to random chance or whether they indicate a real effect or correlation.

    When a result is statistically significant, the likelihood of the observed outcome happening purely by chance is very low: the p-value falls below a predetermined threshold known as the significance level.

    The importance of statistical significance in research and data analysis cannot be overstated. It forms the backbone of decision-making in numerous fields, from clinical trials in healthcare to market research in business.

    Related: The Best Ways to Do Market Research for Your Business Plan | Entrepreneur

    Determining statistical significance helps to differentiate genuine patterns in data from those that may have appeared by coincidence.

    In doing so, it minimizes the risk of false conclusions and ensures the validity and reliability of the research findings.

    What is statistical significance?

    At the heart of statistical significance lies the process of statistical hypothesis testing.

    Statistical hypothesis testing is a structured method used by statisticians to decide if a body of data supports a specific claim or hypothesis about a population.

    It involves formulating two contrasting hypotheses: the null hypothesis and the alternative hypothesis. The null hypothesis is a statement that assumes no effect or relationship between variables. Conversely, the alternative hypothesis proposes that there is an effect or relationship.

    A key concept associated with hypothesis testing is the p-value.

    The p-value quantifies the probability of obtaining the observed data (or data more extreme) if the null hypothesis is true. It serves as a tool for deciding whether to reject the null hypothesis.

    A small p-value (typically ≤ 0.05) indicates strong evidence against the null hypothesis, and you reject the null hypothesis in favor of the alternative hypothesis.

    Another crucial element is the significance level, often denoted by alpha (α). This is a threshold chosen to determine when you reject the null hypothesis.

    The significance level is commonly set at 0.05; results are deemed statistically significant if the p-value falls below it.

    What are the different types of statistical significance testing?

    There are several statistical significance tests, including one-tailed and two-tailed tests.

    A one-tailed test examines the likelihood of an outcome being higher (or lower) than a specific value. In contrast, a two-tailed test considers both possibilities — that the outcome could be higher or lower. The choice between the two depends on the specifics of the study or experiment.

    T tests are another common type of significance testing. T tests are used to compare the means of two groups and determine if they are significantly different from each other.

    They are instrumental in situations where the sample sizes are small, and the population variance is unknown.
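
    Here is a minimal SciPy sketch of a two-sample t test with invented data; ttest_ind is two-tailed by default, and passing alternative="greater" or alternative="less" turns it into a one-tailed test:

        from scipy import stats

        group_a = [12.1, 11.8, 12.4, 12.0, 12.3, 11.9, 12.2]
        group_b = [12.8, 13.1, 12.6, 13.0, 12.9, 13.2, 12.7]

        result = stats.ttest_ind(group_a, group_b)  # two-tailed by default
        alpha = 0.05
        print(f"p-value = {result.pvalue:.4f}")
        if result.pvalue < alpha:
            print("Reject the null hypothesis: the means differ significantly.")
        else:
            print("Fail to reject the null hypothesis.")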

    In hypothesis testing, you must also be wary of type I and type II errors. A type I error (false positive) occurs when you reject a true null hypothesis incorrectly. At the same time, a type II error (false negative) happens when you fail to reject a false null hypothesis.

    Understanding these errors is vital in interpreting the results of statistical significance testing.

    What is the role of sample size and sampling error in statistical significance?

    In statistical analysis, sample size — the number of observations in a sample — is pivotal in obtaining statistically significant results.

    A larger sample tends to give more accurate results because it’s more likely to be representative of the population. In other words, with a larger sample size, the statistical power — the probability of correctly rejecting a false null hypothesis — increases.

    This lessens the likelihood of committing a type II error (failing to reject a false null hypothesis).

    However, increasing the sample size isn’t always practical or cost-effective, and it can sometimes lead to an overly sensitive test that detects statistically significant differences even when they have little practical relevance.

    In conjunction with sample size, understanding the concept of sampling error is vital in interpreting statistical results.

    Sampling error is the difference between a sample statistic used to estimate a population parameter and the actual, but unknown, value of that parameter.

    It arises from the randomness inherent in selecting a sample from a population, and its magnitude tends to decrease as the sample size increases.

    What are some real-world examples of statistical significance at work?

    Statistical significance is a cornerstone concept in many professional fields.

    For instance, researchers use statistical significance in clinical trials to determine whether a medication or treatment is effective.

    Suppose a drug trial results in a lower average illness duration than a placebo; researchers would use statistical significance testing to discern whether the difference reflects the drug’s effectiveness or merely random variation.

    Statistical significance plays a significant role in business, particularly in pricing and market research.

    For instance, if a company changes its product pricing and subsequently observes a change in sales, statistical significance can help determine if the observed difference is a real effect of the new pricing strategy or merely a random fluctuation.

    Related: 10 Pricing Strategies That Can Drastically Improve Sales | Entrepreneur

    In another scenario, consider a large tech company trying to understand the behavior of its users. With vast data sets, statistical significance helps data analysts sift through the noise and identify meaningful trends and patterns that could inform decision-making processes.

    What is the importance of effect size and confidence interval?

    While statistical significance indicates whether an effect exists, the effect size provides a measure of the magnitude of that effect. Effect size is critical when considering the practical significance of a result.

    For instance, a study might find a statistically significant difference in test scores between two groups of students taught using different methods. However, if the score difference is only marginal, it may not have much practical significance, despite its statistical significance.

    A confidence interval, on the other hand, gives an estimated range of values that is likely to include an unknown population parameter. It provides a measure of uncertainty around the estimate of effect size.

    For example, a 95% confidence interval indicates that were the study repeated numerous times, we’d expect the confidence interval to contain the true population parameter 95% of the time.

    Confidence intervals and effect size provide a more holistic view of research results beyond whether an effect is statistically significant.
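
    As a short illustration, here is how a 95% confidence interval for a sample mean can be computed with SciPy; the data is invented:

        from scipy import stats

        data = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 5.4, 4.7, 5.0, 5.1]
        n = len(data)
        sample_mean = sum(data) / n

        # t-based interval: mean +/- t * (s / sqrt(n)), via SciPy's helper.
        low, high = stats.t.interval(0.95, df=n - 1, loc=sample_mean,
                                     scale=stats.sem(data))
        print(f"mean = {sample_mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")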

    What is the role of statistical power in statistical significance?

    In hypothesis testing, statistical power is defined as the probability that a test correctly rejects the null hypothesis when the alternative hypothesis is true. Simply put, it is the likelihood of finding a statistically significant result when there truly is an effect or difference.

    Statistical power is influenced by several factors, including the sample size, the effect size (the magnitude of the difference or relationship you’re testing), the number of variables, and the significance level (the probability of rejecting the null hypothesis when it is true).

    Increasing the sample size, the effect size or the significance level increases the power of the test. This means there’s a greater chance of detecting an effect or difference when it truly exists, reducing the risk of a type II error.
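
    These relationships can be made concrete with a power analysis. The sketch below uses statsmodels to ask how many subjects per group are needed to detect a medium-sized effect (Cohen's d = 0.5) at a 0.05 significance level with 80% power; the inputs are illustrative:

        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(effect_size=0.5,
                                                  alpha=0.05, power=0.8)
        print(f"Required sample size per group: {n_per_group:.0f}")  # about 64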

    In practical terms, a study with low power might fail to detect a genuine effect or difference, leading to a false negative result.

    Conversely, a study with high power has a better chance of detecting an effect when it exists, providing more reliable results and making the research findings more meaningful.

    Common misinterpretations and misuse of statistical significance

    While statistical significance is a valuable tool in research, it can often be misunderstood and misused.

    One common pitfall is the confusion between statistical significance and clinical or practical significance.

    Statistical significance refers to the likelihood that the results are due to chance, whereas clinical significance refers to whether the results have a meaningful, real-world impact.

    A study may find a statistically significant result with little to no real-world relevance; thus, it’s essential to consider both types of significance in interpretation.

    Another common issue is the misinterpretation of p-values. A p-value is a probability, not a measure of the size or importance of an effect.

    A small p-value does not necessarily mean that the effect is large or important; conversely, a large p-value does not mean the effect is small or unimportant.

    Finally, the occurrence of false positives, or type I errors, is a major challenge in statistical testing. A false positive occurs when the null hypothesis is rejected even though it is true, implying an effect or difference where there isn’t one.

    This can lead to faulty conclusions and misinformed decisions. Multiple-testing corrections and a thorough understanding of the underlying statistical concepts help avoid these common errors, lending credibility and reliability to research findings.

    How to use statistical significance in Excel

    Microsoft Excel, though primarily a spreadsheet tool, is also frequently used for statistical analysis.

    For those who are statisticians or aspire to be, here’s a simple step-by-step guide to conducting tests of statistical significance in Excel:

    1. Enter your data: Begin by inputting your data into Excel. For example, say you have two sets of data that you want to compare.
    2. Use Excel’s built-in functions: Excel provides a set of functions that can be used to perform statistical tests. For example, to perform a t-test, you could use the function “T.TEST.”
    3. Interpret the results: The result of the “T.TEST” function in Excel is the p-value. You can reject the null hypothesis if this value is less than the significance level (typically 0.05).

    Related: This Comprehensive Microsoft Excel Course Can Turn You into a Whiz for $10 | Entrepreneur

    Here are some tips for statisticians using Excel:

    • Always double-check your data and formulas to ensure accuracy.
    • Use Excel’s built-in functions as much as possible. They are optimized for accuracy and can save you time.
    • Familiarize yourself with Excel’s Analysis ToolPak add-in. It’s a powerful resource for conducting statistical analysis.

    Why statistical significance matters to you

    Statistical significance is crucial in various fields — from scientific research to business analytics, healthcare and marketing.

    It is a fundamental concept that assists in the decision-making process by providing a means to determine if a result is likely due to chance or represents a real effect.

    Related: 9 Best Business Analytic Tools in 2023 | Entrepreneur Guide

    Understanding the theory and practical application of statistical significance enables researchers and professionals to make informed decisions based on data.

    It contributes to enhancing research credibility, provides a solid foundation for evidence-based practices and aids in drawing meaningful conclusions from data sets.

    Whether you’re a researcher unveiling new scientific discoveries, a business analyst spotting market trends or a health professional interpreting clinical trial results, statistical significance is an indispensable tool.

    By responsibly interpreting statistical significance and combining it with practical significance, you can continue to make impactful strides in your respective fields.

    To learn more about statistical significance and how it could benefit your business, check out Entrepreneur’s other articles for additional information.

    Entrepreneur Staff

    Source link

  • How to Harness Data for the Underserved Market | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    Simply identifying who belongs to our most underserved populations is a challenge that can make public sector outreach nearly impossible. To overcome these obstacles and better serve consumers, we must first identify the underserved and their needs. Public agencies can use two powerful tools to help reach the most vulnerable: credit data and alternative data.

    Defining underserved populations

    The first thing agencies must do is understand what an underserved population is. The Department of Health and Human Services (HHS) has a good definition specific to healthcare: individuals who have experienced healthcare disparities. Healthcare disparities can manifest due to a lack of available services, difficulty accessing care and limited knowledge about navigating the healthcare system or finding providers. Agencies looking to define underserved populations can adapt this definition to their specific fields.

    The Federal Reserve also has a good working definition: people who don’t have access to a bank. This lens is useful because a lack of access to essential banking functions is a significant barrier to receiving other public services. Without a bank account, options to cash checks are limited and often come with additional hurdles like stricter controls, timing requirements and increased fees. Those without a bank account also can’t receive direct-deposit benefits or earn the savings interest that could help them get ahead. Identifying the unbanked and underbanked is an excellent first step in using data to find and reach individuals likely underserved by public benefits.

    Research shows underserved populations regularly fit into specific demographic groups. These groups include the unemployed and elderly, veterans, disabled persons, those living below the poverty line and those residing in rural areas. Through a combination of factors, these groups are at the highest risk of needing government benefits while often participating in assistance programs at lower rates.

    Related: Leverage the Power of Data to Boost Your Sales — and Your Customer Connections

    Out-of-reach insights

    As it stands, government agencies have an incomplete picture of who uses their services. This lack of understanding is partly due to outdated privacy laws and red tape; until recently, government websites weren’t allowed to collect cookies on their visitors. Of course, there’s a fine line between protecting privacy and using data to better reach underserved populations. Still, many government agencies could be more effective with the data they already have at their disposal. Crucial insights may remain out of reach for agencies that struggle to analyze the reams of data spread across their systems.

    Public sector executives must meet this pervasive problem with a viable solution. Veterans are one significantly underserved group, often because states don’t have access to a robust database covering their veteran populations. But they’re only one of the groups public agencies often overlook. And while many agencies are getting better at using digital tools and data analytics, there’s still work to be done. Improving outreach is one way to close this gap, and we can do so through the judicious use of good data.

    Related: Using Data Analytics Will Transform Your Business. Here’s How.

    Data unlocks doors

    The private sector is good at leveraging data to identify and reach its customers. Most brands and companies know the demographic data of their typical consumers — and they’re experts at turning that knowledge into profits. Data can reveal where a company’s target market lives, how it responds to advertising and other key behaviors that better enable retailer outreach. Public sector agencies can operate in the same way.

    For example, take the bus system in Montgomery County, Maryland. The county’s Department of Transportation redesigned its bus system to introduce the Flash. The redesign happened because the agency examined its proprietary data on its typical riders. Before the redesign, bus riders often had to make multiple transfers, adding inconvenience to their lives.

    The Montgomery County Department of Transportation (MCDOT) reviewed whom this problem affected, the peak times at which it affected them and how riders used the bus system. Then it created new routes, resulting in a significantly improved and more efficient customer experience. Innovations like these are precisely what other public sector agencies need to embrace to serve constituents more effectively.

    Related: Redefining Customer Engagement in a World Where Data Privacy Reigns

    Taking action

    Good data is essential to determine the best way to connect with consumers. But how exactly do busy public sector leaders begin implementing a more robust data analytics strategy? External data is readily available through many public sources. Companies like credit reporting agencies have access to a plethora of information on underserved populations. They can help pinpoint the most vulnerable audiences — who they are and what they need — to maximize the good a new outreach program can do.

    Internal usage data may also be key to determining the highest area of need. Public transit is an excellent example: Adding a bus route in an affluent suburb may not be as important as expanding or optimizing routes in a high-density metropolitan area because most suburban people have cars. Agencies will only discover information like this by leveraging data.

    Analytics are particularly valuable when they inform the best strategy for reaching those in need. Not all methods work for all audiences: one group may be best reached via email, another may be more open to television ads, and yet another may be most receptive to telephone outreach. Analytics can provide valuable insights that keep agencies from wasting resources on dead ends or unnecessary services.

    Quality service begins with informed outreach

    Public services are intended to help the people who need them most. But to meet the mark, we must first know their needs. Improving the customer experience begins with a solid outreach strategy guided by both external and internal data and analytics.

    Modern tools can help us close the gap in need, enhancing the quality of life for the most vulnerable and elevating our society. Data is the engine powering the train toward that goal.

    Scott Straub

    Source link

  • How Data Analytics Can Transform Your Business | Entrepreneur

    Opinions expressed by Entrepreneur contributors are their own.

    The digital age has ushered in a new era where data reigns supreme, providing businesses with valuable insights into customer behavior, market trends and overall business performance. In order to thrive in today’s highly competitive landscape, entrepreneurs must not only recognize the significance of data analytics but also leverage its power to drive their organizations forward.

    At its core, data analytics involves the systematic examination of raw data with the purpose of drawing meaningful conclusions. By embracing this approach, businesses gain the ability to understand their operations at a granular level, make data-driven decisions, accurately predict future trends and ultimately foster growth and profitability. Let us delve deeper into the ways in which data analytics can revolutionize your business.

    Related: Eight Ways Data Analytics Can Revolutionize Your Business

    How data analytics can transform your business

    Enhancing customer experience:

    One of the greatest benefits of data analytics lies in its capacity to help businesses better comprehend their customers. By analyzing various data points, such as purchasing habits, social media interactions and website visits, organizations can create comprehensive profiles that encompass customers’ preferences and behaviors. Armed with this knowledge, businesses can tailor their product offerings, personalize marketing messages and ultimately enhance the overall customer experience. Consequently, this leads to increased customer satisfaction, loyalty and a competitive edge in the market.

    Streamlining operations:

    Data analytics serves as a powerful tool for uncovering inefficiencies within a business’s operations. By examining production data, for example, businesses can identify bottlenecks within their manufacturing processes. Similarly, studying sales data may shed light on underperforming products or regions. Armed with these insights, businesses can take the necessary steps to streamline their operations, reducing waste and enhancing overall efficiency. Ultimately, this results in cost savings and improved productivity, thereby giving businesses a competitive advantage.

    Mitigating risks:

    Inherent to any business endeavor is an element of risk. However, data analytics empowers businesses to anticipate and mitigate potential risks effectively. By closely analyzing data, businesses can identify patterns and trends that may indicate forthcoming issues. This allows organizations to take proactive measures, ranging from real-time detection of fraudulent transactions to predicting future market volatility. By staying one step ahead, businesses can better protect their interests, reduce financial losses and ensure long-term stability.

    Guiding strategic decision-making:

    Data analytics eliminates much of the guesswork associated with decision-making processes. By providing factual insights, it serves as a reliable guide when it comes to making strategic choices. Whether it involves entering new markets, launching innovative products or investing in cutting-edge technology, businesses can rely on data-driven decision-making to reduce uncertainty and increase the likelihood of success. Armed with accurate information, entrepreneurs can make informed choices that align with their long-term objectives.

    Related: Leverage the Power of Data to Boost Your Sales — and Your Customer Connections

    How can you effectively harness the power of data analytics within your business?

    Embrace a data-driven culture:

    To embark on a successful data analytics journey, it is crucial to foster a data-driven culture within your organization. This entails training employees to understand and utilize data in their day-to-day work, encouraging them to base their decisions on concrete data rather than relying solely on intuition.

    Invest in the right tools:

    The market offers a wide array of data analytics tools catering to various business sizes, industries and specific needs. Options range from robust business intelligence platforms, such as Tableau and Power BI, to advanced machine learning tools; it is essential to select the ones that align with your organization’s unique requirements.

    Hire or outsource expertise:

    Interpreting data and extracting meaningful insights necessitates specific skills. If your organization lacks in-house expertise, consider hiring data analysts or data scientists to fulfill these roles. Alternatively, you may choose to outsource your data analytics needs to specialized firms that possess the necessary knowledge and experience.

    Prioritize data privacy:

    In an era marked by frequent data breaches and privacy scandals, handling data responsibly is of paramount importance. It is crucial for businesses to ensure that their data practices comply with relevant regulations and industry standards. This includes implementing robust data privacy measures to protect sensitive information and maintaining transparency in how customer data is collected, stored and used. By prioritizing data privacy, businesses can build trust with their customers and safeguard their reputations.

    In conclusion, data analytics has the potential to be a game-changer for businesses in today’s information-driven landscape. By harnessing the power of data, organizations can gain valuable insights into customer behavior, optimize their operations, mitigate risks and make informed strategic decisions. However, reaping the benefits of data analytics requires a deliberate and strategic approach.

    It begins with embracing a data-driven culture within the organization, where employees are empowered to utilize data in their decision-making processes. Investing in the right data analytics tools is crucial, as it enables businesses to effectively collect, analyze and interpret data. Depending on the organization’s resources and expertise, hiring data analysts or outsourcing data analytics services may be necessary to extract meaningful insights from the data.

    Furthermore, businesses must prioritize data privacy and ensure compliance with relevant regulations. Protecting customer data and maintaining their trust is essential in the age of increasing privacy concerns. By adopting these practices, businesses can unlock the full potential of data analytics and drive growth, efficiency and innovation.

    Related: Using Data Analytics Will Transform Your Business. Here’s How.

    In today’s digital landscape, data is no longer just a byproduct of business operations. It has evolved into a valuable asset that holds the key to unlocking opportunities and staying ahead of the competition. Embracing data analytics is no longer an option but a necessity for businesses that aim to thrive in this dynamic and data-centric environment.

    So, seize the power of data analytics, and embark on a journey to transform your business. Embrace the insights that data can offer, streamline operations, enhance customer experiences, mitigate risks, and make informed decisions that propel your organization toward success. Remember, in the age of data, the possibilities are endless, and the businesses that effectively leverage data analytics will gain a significant competitive advantage in the marketplace.

    Aidan Sowa

    Source link