ReportWire

Tag: meta ai

  • How to Create Drone Shot Video Using Your Photo


    • If you are a Jio SIM user in India or a student, Google is currently offering the subscription for free as an annual trial.
    • This is because the output video adds the camera movement, as if it were captured from a drone.
    • Can I control the camera movement of a drone shot in an AI-generated video?

    Footage captured by drones engages viewers with the perspective it offers. But flying a drone is not easy, and it comes with plenty of logistical challenges. Video generation through artificial intelligence has greatly expanded what is possible: you can now create footage that looks as if it were captured by a drone from just a single image. We will test outputs from both free and paid platforms and compare their quality, looking at how they simulate camera movement, depth, and other details to produce convincing footage.

    Testing the Free and Paid Methods

    Since this innovation emerged, many platforms have introduced this feature. We will be testing Meta AI, Meta's own platform, which offers a free way to generate videos from a single prompt. Next, we will test Veo 3 through Gemini, Google's platform; this feature requires at least a Google AI Plus subscription.

    Free Method: Meta AI

    Mark Zuckerberg’s Meta AI now lets you create videos from still images. We can use this feature to create some attractive drone shots.

    1. Open the official Meta AI website at meta.ai and log in with an Instagram or Facebook account. Click the ‘+’ icon and select ‘Create’.


      2. Make sure ‘video’ is selected, then click the ‘+’ icon again to upload the image.


      3. After you have uploaded the image, add this prompt: “Create a drone shot, as if the drone is ascending into the sky. The shot should start from the provided image.” and hit ‘Animate’.


      Shortly, the video will be generated. You can add additional prompts to edit it, and you will also be able to download it.

      Paid Method: Gemini (Veo)

      Google’s Veo 3 is among the most advanced AI-powered video generation models. However, it requires a paid subscription of Gemini Pro to create meaningful videos. If you are a Jio SIM user in India or a student, Google is currently offering the subscription for free as an annual trial.

      1. After opening a new chat in Gemini, select ‘Create Videos (Veo 3.1)’ from the tools menu and click on the plus icon to upload your image.


          2. Now, enter the same prompt: “Create a drone shot, as if the drone is ascending into the sky. The shot should start from the provided image.” and hit send.


          The video will be generated in 1-2 minutes, and you can download it.
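      If you would rather script this than click through the Gemini UI, Veo is also exposed through the Gemini API. Below is a minimal, illustrative sketch using the google-genai Python SDK; the model ID ("veo-3.1-generate-preview"), image-to-video availability, and pricing depend on your plan and region, so treat the exact names here as assumptions rather than a prescribed method.

```python
import time
from google import genai
from google.genai import types

client = genai.Client()  # expects an API key in the GEMINI_API_KEY environment variable

# Load the still photo the drone shot should start from.
with open("photo.jpg", "rb") as f:
    start_image = types.Image(image_bytes=f.read(), mime_type="image/jpeg")

prompt = (
    "Create a drone shot, as if the drone is ascending into the sky. "
    "The shot should start from the provided image."
)

# Video generation runs as a long-running operation; poll until it completes.
operation = client.models.generate_videos(
    model="veo-3.1-generate-preview",  # assumed model ID; check the current Veo model list
    prompt=prompt,
    image=start_image,
)
while not operation.done:
    time.sleep(15)
    operation = client.operations.get(operation)

# Download the first generated clip to disk.
video = operation.response.generated_videos[0]
client.files.download(file=video.video)
video.video.save("drone_shot.mp4")
```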

          Difference in the Quality of Output

          Meta AI: This produces decent results if you just want to try video generation for fun. The output video adds camera movement, as if the footage were captured from a drone, but the details are not maintained satisfactorily, and one can tell it is a generated video. Still, as mentioned, it is ideal if you want to have fun and share such videos with your friends.


          Veo 3.1 (Gemini): Google’s Veo produces a much better output than the free alternative. Apart from the camera movement, which is genuinely smooth, the video has realistic depth, making it appear cinematic. The details are also preserved better, so this footage can be used for professional content creation. A background has also been added to complement the video.


          FAQs

          Q. Do I need a good-quality image to generate a drone shot from a photo?

          It is better to upload a high-quality image to generate better output. AI will perform better if the image has good detail and clear foreground-background separation.

          Q. Can I control the camera movement of a drone shot in an AI-generated video?

          To some extent, you can. Try giving a more detailed prompt to get results that match your specific needs.

          Wrapping Up

          Video generation platforms free creators to realize their imagination, especially when they have limited access to equipment. In my opinion, this can be a good starting point and an aid to your content creation process, but it should not be relied on entirely. Looking at the results, Veo 3.1 edges ahead of Meta AI. But as you know, quality comes at a cost, and here that cost is the Google AI Plus subscription.

    Mitash Arora

  • Nearly a third of American teens interact with AI chatbots daily, study finds


    New York (CNN) — Nearly a third of US teenagers say they use AI chatbots daily, a new study finds, shedding light on how young people are embracing a technology that’s raised critical safety concerns around mental health impacts and exposure to mature content for kids.

    The Pew Research Center study, which marks the group’s first time surveying teens on their general AI chatbot use, found that nearly 70% of American teens have used a chatbot at least once. And among those who use AI chatbots daily, 16% said they did so several times a day or “almost constantly.”

    AI chatbots have been pitched as learning and schoolwork tools for young people, but some teens have also turned to them for companionship or romantic relationships. That’s contributed to questions about whether young people should use chatbots in the first place. Some experts have worried that their use even in a learning context could stunt development.

    Pew surveyed nearly 1,500 US teens between the ages of 13 and 17 for the report, and the pool was designed to be representative across gender, age, race and ethnicity, and household income.

    ChatGPT was by far the most popular AI chatbot, with more than half of teens reporting having used it. The other top players were Google’s Gemini, Meta AI, Microsoft’s Copilot, Character.AI and Anthropic’s Claude, in that order.

    A nearly equal proportion of girls and boys — 64% and 63%, respectively — say they’ve used an AI chatbot. Teens ages 15 to 17 are slightly more likely (68%) to say they’ve used chatbots than those ages 13 to 14 (57%). And usage increases slightly as household income goes up, the survey found.

    Just shy of 70% of Black and Hispanic teens say they’ve used an AI chatbot, slightly higher than the 58% of White teens who say the same.
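    As a rough guide to how precise those percentages are: with a sample of roughly 1,500 teens, the margin of error on a reported proportion is around plus or minus 2 to 3 percentage points (and larger for subgroups such as girls or boys alone), so small gaps like 64% versus 63% are within the noise. The back-of-the-envelope calculation below is illustrative only; it assumes simple random sampling and guessed subgroup sizes, which understates the error for a weighted survey like Pew's.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a survey proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample of "nearly 1,500" teens vs. subgroups of roughly half that size.
for label, p, n in [
    ("girls who used a chatbot", 0.64, 750),
    ("boys who used a chatbot", 0.63, 750),
    ("all teens who used a chatbot", 0.63, 1500),
]:
    print(f"{label}: {p:.0%} +/- {margin_of_error(p, n):.1%}")
```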

    The findings come after two of the major AI firms, OpenAI and Character.AI, have faced lawsuits from families who alleged the apps played a role in their teens’ suicides or mental health issues. OpenAI subsequently said it would roll out parental controls and age restrictions. And Character.AI has stopped allowing teens to engage in back-and-forth conversations with its AI-generated characters.

    Meta also came under fire earlier this year after reports emerged that its AI chatbot would engage in sexual conversations with minors. The company said it had updated its policies and next year will give parents the ability to block teens from chatting with AI characters on Instagram.

    At least one online safety group, Common Sense Media, has advised parents not to allow children under 18 to use companion-like AI chatbots, saying they pose “unacceptable risks” to young people.

    Some experts have also raised concerns that the use of AI for schoolwork could encourage cheating, although others say the technology can provide more personalized learning support.

    Meanwhile, AI companies have pushed to get their chatbots into schools. OpenAI, Microsoft and Anthropic have all rolled out tools for students and teachers. Earlier this year, the companies also partnered with teachers unions to launch an AI instruction academy for educators.

    Microsoft, in particular, has sought to position its Copilot as the safest choice for parents, with AI CEO Mustafa Suleyman telling CNN in October that it will never allow romantic or sexual conversations for adults or children.

    Clare Duffy and CNN

  • Mark Zuckerberg’s Net Worth Drops As Meta’s AI Plan Spooks Investors


    Mark Zuckerberg fell to fifth place on the Bloomberg Billionaires Index — the lowest in nearly two years — as investors spooked by Meta Platforms Inc’s planned $30 billion debt sale sent the company’s shares spiraling amid a flurry of tech earnings shaking up the ranks of the world’s richest.

    Meta’s stock fell 11% — the most since 2022 — after the company said it was going to issue the biggest investment-grade bond offering of the year to boost spending on artificial intelligence research, dropping Zuckerberg’s net worth to $235.2 billion, according to the wealth index.

    He was leapfrogged by Amazon.com Inc.’s Jeff Bezos and Alphabet Inc.’s Larry Page, who hadn’t been among the four-richest people since October 2023. Alphabet’s shares climbed 2.5% after it reported revenue that beat analysts’ expectations amid a surge in demand for its cloud and AI services.


    Zuckerberg’s $29.2 billion drop was the fourth-largest one-day market-driven decline ever recorded by Bloomberg’s wealth index.

    Meta’s stock had gained 28% this year before Thursday’s swoon, adding $57 billion to Zuckerberg’s fortune. But doubts over Meta’s ballooning AI budget gave investors pause, with at least two analysts downgrading the company’s shares after it said it expected to spend up to $118 billion in capital expenditures this year and possibly more in 2026.

    Amazon shares have gained more than 30% since a mid-April low. Investors have cheered its cloud-computing unit, which has steadily grown as it has signed splashy deals with AI firms including Anthropic. The company reported third-quarter sales and profit that topped estimates, sending shares surging in after-hours trading. 
     

    (Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)



  • Oakley Meta Vanguard review: Sporty to a fault


    By now, I have a well-established routine when I set up a new pair of Meta smart glasses. I connect my Instagram, WhatsApp and Spotify accounts. I complete the slightly convoluted steps in my Bluetooth settings to make sure Meta AI can announce incoming phone calls and text messages. I tweak the video settings to the highest quality available, and change the voice of Meta AI to “English (UK)” so it can talk to me in the voice of Judi Dench.

    But with the $499 Oakley Meta Vanguard glasses, there’s also a new step: deciding what the customizable “action button” should do. The action button isn’t even my favorite part of using the glasses, but it’s a sign of just how different these shades are from the rest of Meta’s lineup.

    While the second-gen Ray-Ban and Oakley HSTN glasses iterated on the same formula Meta has used for the last few years, the Vanguard glasses are refreshingly different. They aren’t really meant to be everyday sunglasses (unless you’re really committed to your athletic pursuits) but they are in many ways more capable than Meta’s other smart glasses. The speakers are louder, the camera has new abilities and they integrate directly with Strava and Garmin. And while these won’t replace my go-to sunglasses, there’s more than enough to make them part of my fitness routine.


    Wraparound frames aren’t for everyone, but the new look enables some unique capabilities that will appeal to even casual athletes.

    Pros

    • Better battery life, speakers and durability than Meta’s other glasses
    • Redesigned camera makes photos and videos more usable
    • Action button means you can do more without saying “Hey Meta”
    Cons

    • Hyperlapse clips are kind of jittery
    • More third-party app integrations, please


    New look, new setup

    The sunglasses were very clearly made with athletes in mind. The Oakley Meta Vanguard glasses are the type of shades a lot of people probably think of when they hear “Oakley sunglasses.” The wraparound frames with colorful, reflective lenses are the style of glasses you might associate with a high school track coach, or your neighbor who is really serious about cycling.

    The pair I tested had black frames and Oakley’s orange “Prizm 24K” lenses, which aren’t polarized but are favored by a lot of athletes for their ability to dial up the contrast of your surroundings. I was able to comfortably wear my pair in bright, sunny conditions and also in more overcast lower light. I also appreciate that the lenses are swappable, so you can switch them out for a dedicated low-light or different-colored lens depending on your conditions. (Extra lenses cost $85 each and will be available to purchase separately soon, according to Meta.) These glasses don’t, however, support prescription lenses of any kind.


    I wouldn’t wear these as everyday sunglasses, but I don’t mind the look for a trail run. (Karissa Bell for Engadget)

    I realize this style of sunglasses won’t be appealing to everyone, but the frame shape does enable a slightly different setup than what we’ve seen with any of Meta’s other smart glasses. Most noticeably, the camera is in the center of the glasses, just above the nosebridge. The LED that lights up when the camera is on is also in the center, near the top of the frames.

    As with Meta’s other smart glasses, you can control volume and music playback via a touchpad on the right side of the glasses, but the capture button to take photos and videos is now on the underside of the glasses rather than on top. This is meant to make it a bit easier to reach if you’re wearing a hat or helmet, though I found it took me a few tries to get used to the new placement. Behind the capture button is the previously mentioned “action button,” which can be customized to trigger specific functions via the Meta AI app.


    The capture button (left) and the action button (right) are both on the underside of the frames rather than on top. (Karissa Bell for Engadget)

    I haven’t yet figured out what the best use for the action button is, though I’ve tried out a few different setups. On one hike, I set it up to automatically call my husband, kind of like a speed dial. During a bike ride, I had it set to record a hyperlapse video. I’ve also tried it out as a shortcut for launching a specific Spotify playlist or as a general trigger for Meta AI. With all of these, I appreciated that the action button allowed me to do something without saying the “Hey Meta” command. Repeating “hey Meta” to my glasses in public has always felt a bit cringey, so it was nice to have a much more subtle cue available.

    Did I mention it’s for athletes?

    The Vanguard’s athlete-focused features go beyond the sportier frames. The shades come with new integrations for two of the most popular run and bike-tracking platforms: Garmin and Strava. If you have a supported Garmin watch or bike computer, you can set up the glasses to automatically capture video clips based on metrics from your activity, like hitting a particular heart rate zone or other milestone. You can also ask Meta AI directly to tell you about stats from your Garmin watch, like “hey Meta, what’s my pace.”

    I don’t have a Garmin watch, though I did briefly test out some of these features during my hands-on at Meta Connect. I suspect a lot of runners and cyclists may still find it easier to simply glance at their watch to see stats, but having it all available via voice commands doesn’t seem like a bad thing either.

    Strava’s integration isn’t quite as deep. If you’re tracking a run, hike or ride while wearing the glasses, you can overlay your stats directly onto photos and videos from your activity. This includes metrics like distance and elevation, as well as heart rate if you’re also wearing an Apple Watch or other tracker that’s connected to the Strava app. Here’s what it looks like with a photo from a recent bike ride.


    You can overlay your Strava stats onto the photos and videos you record. (Karissa Bell for Engadget)

    I typically don’t share stats from runs or bike rides (usually because they aren’t that impressive), but it’s a bit more appealing than just sharing a straight Strava screenshot. Another neat feature is that if you share a video, you can watch the stats change in real time alongside your recording. That level of detail isn’t particularly interesting for a mostly flat bike ride on a city street, but I can see how it would be a lot more compelling on a more technical trail ride or in a race.

    My only complaint, really, is that Meta has limited these kinds of features to Garmin and Strava’s platforms so far. I’d love to have support for my favorite ski-tracking app, Slopes, and I’m sure there are plenty of people who’d be happy to have an integration with their running or workout-tracking app of choice. Meta has announced some plans to bring more third-party apps onto its smart glasses platform so there might be hope here.

    There are other improvements, though, that will be appealing to even casual athletes. The speakers are a lot louder to account for potentially noisy conditions like a congested roadway or high-wind environment. I never had to crank the volume up anywhere near the max during my bike rides or runs, but I can say the speakers were loud and clear enough that I was able to comfortably listen to a podcast with the glasses laying next to me on the couch at full volume.

    The new centered camera placement is meant to make it harder for a hat or helmet to interfere with your shots, which has been a consistent issue for me with Meta’s other smart glasses. The new position didn’t totally solve this — I still found that my bike helmet made it into the top of my pics — but at least it’s easier to crop out now that my headgear is centered over the top of my image rather than awkwardly sticking out on one side.

    The 12MP ultra-wide camera also comes with new video stabilization settings that make it feel a bit more like a replacement for an action cam. The glasses are set to automatically select a level of stabilization based on your motion, but you can also manually choose between low, medium or high stabilization (stabilization is locked at “medium” if you opt to record in 3K). I’ve mostly left it with the default settings and have been impressed with the results.


    The LED light is also a bit more subtle than on Meta’s other smart glasses. (Karissa Bell for Engadget)

    The Vanguard glasses are also Meta’s first smart glasses that can record hyperlapse and slow-motion video. Hyperlapse should be familiar to Instagram users who used the now-defunct app of the same name to record timelapse clips. Now, you can say “Hey Meta, start a hyperlapse” and the glasses will record a similar sped-up clip. My hyperlapse clips ended up looking a bit jittery, though, compared to the timelapse shots I’m used to getting with my GoPro.  And unfortunately, there’s no way to adjust the cadence of the video like you used to be able to with the dedicated app.
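    If the fixed hyperlapse cadence bothers you, one workaround (my own, not a Meta feature) is to record a regular clip and speed it up yourself afterward. Here is a small sketch that shells out to ffmpeg, assuming it is installed on your machine; the 8x factor and file names are arbitrary.

```python
import subprocess

def speed_up(src: str, dst: str, factor: float = 8.0) -> None:
    """Make a hyperlapse-style clip by compressing video presentation timestamps."""
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-filter:v", f"setpts=PTS/{factor}",  # play frames back 'factor' times faster
            "-an",                                # drop audio; sped-up audio isn't usable
            dst,
        ],
        check=True,
    )

speed_up("ride_full_speed.mp4", "ride_hyperlapse.mp4")
```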

    My slow-motion clips, on the other hand, came out better. It’s not something I’d expect to use very often during a bike ride or trail run, but the POV angle is great for recording clips of pets or kids. Meta is also planning to bring support for hyperlapse and slow-motion videos to the rest of its glasses lineup, though, so you don’t need to get these particular shades to take advantage of the feature.

    The other major improvement is battery life. The Vanguard glasses have a notably better battery life compared with the second-gen Ray-Ban glasses or the HSTN frames (probably because the bigger frames allow for a larger battery). According to Meta, the Vanguard glasses can go nine hours on a charge with “typical use” or six hours with continuous audio playback. I was actually able to get a little over six hours of audio on a single charge, so they should hold up pretty well if you’re running marathons or competing in longer races. As usual, exact battery life can vary a lot depending on how much you’re using more resource-intensive features like video recording or Meta AI.


    The bigger frames and charging case give the glasses a battery life boost. (Karissa Bell for Engadget)

    I’m especially looking forward to seeing how these glasses will hold up during a day of snowboarding. Meta previously told me that the battery has been optimized for a wider spectrum of temperatures so hopefully the battery won’t drain as quickly on the mountain as Meta’s other glasses. And with increased water resistance — the shades have an IP67 rating —  I wouldn’t worry about dropping them in the snow.

    Should you buy these?

    While Meta and EssilorLuxottica have gotten very good at making smart glasses (sorry, Mark Zuckerberg, I won’t call them “AI glasses”), they are still somewhat of a niche product. And the ultra-sporty Oakley Vanguard glasses are even more niche. At $499, these are also more expensive than other models.

    That, understandably, may feel too steep for a pair of sunglasses you’re likely only going to wear during specific activities. But if you’re a dedicated cyclist, runner, hiker or [insert outdoor activity of your choice], there’s a lot to like. The camera makes a lot more sense for action cam-like POV footage, and better video stabilization means you’re more likely to get shots you actually want to share. Ready-made Garmin and Strava integrations are practically begging for you to brag about your latest PR or race time, which will certainly appeal to many.


  • Meta AI’s app downloads and daily users spiked after launch of ‘Vibes’ AI video feed | TechCrunch


    New data indicates that use of Meta AI’s mobile app for iOS and Android has seen a significant increase. According to a new analysis from market intelligence provider Similarweb, the app’s daily active users across both platforms jumped to 2.7 million as of October 17, up from around 775,000 just four weeks ago. In addition, Meta AI’s app installs are also up, reaching 300,000 new downloads per day, compared with under 200,000 daily downloads a few weeks ago.

    For comparison, Meta AI’s app had just 4,000 daily downloads a year ago, on October 17, 2024.


    The firm says it hasn’t seen any meaningful correlation in either search or advertising estimates, but notes Meta could be running Facebook or Instagram promotions that wouldn’t be captured in its model.

    However, there’s also another possible explanation for the sharp rise: the launch of Meta’s new Vibes feed in September, which introduced short-form AI-generated videos to the Meta AI mobile app.

    Meta AI introduced the Vibes feed on September 25, which correlates with the sharp increase in the app’s daily active users on iOS and Android, as seen in the chart below.


    Recently, OpenAI’s video generator Sora drew headlines as its app reached the top of the App Store when users rushed to try the new technology. However, Meta AI could have benefited from this launch as well. While Similarweb says its data doesn’t prove cause and effect, it’s possible that the attention to Sora drove some people to try Meta AI, in order to compare the two experiences.

    Another possibility is that Meta could be benefiting from Sora’s invite-only status. That is, those who couldn’t try out the OpenAI app may have looked for an alternative to experiment with. This would be an interesting explanation, too, as it suggests that OpenAI’s decision to gatekeep Sora may have directly boosted its rivals.


    Meta AI’s Vibes feed, which showcases AI videos, may have driven a spike in app downloads and usage. (Image credits: Similarweb)

    As of October 17, Meta AI’s app had seen a 15.58% increase in daily active users worldwide, while ChatGPT, Grok, and Perplexity saw declines of 3.51%, 7.35%, and 2.29%, respectively.

    Sarah Perez

  • Meta previews new parental controls for its AI experiences | TechCrunch


    Meta on Friday previewed its upcoming parental control features for teens’ conversations with AI characters on its platforms. The features, which will be rolled out next year, include the ability to block certain characters and monitor conversation topics.

    Starting in the coming months, parents will be able to turn off chats with AI characters entirely for teens. This action won’t block access to the Meta AI chatbot — the company’s general-purpose AI chatbot — which will only discuss age-appropriate content.

    Parents will also be able to turn off chats with individual characters if they prefer more selective control. Plus, they will receive information about the topics teens are discussing with AI characters and Meta AI.

    The company said it plans to roll out these controls on Instagram early next year. They will be available in English in the U.S., U.K., Canada, and Australia.

    “We recognize parents already have a lot on their plates when it comes to navigating the internet safely with their teens, and we’re committed to providing them with helpful tools and resources that make things simpler for them, especially as they think about new technology like AI,” the company said in a post written by Instagram head Adam Mosseri and newly appointed Meta AI head Alexandr Wang.

    Earlier this week, Meta said that its content and AI experiences for teens will follow a PG-13 movie rating standard and will avoid sensitive topics such as extreme violence, nudity, and graphic drug use.

    The company added that currently, teens are only allowed to interact with a limited number of characters that follow age-appropriate content guidelines. Parents can also set time limits on teens’ interactions with AI characters. Earlier this year, Instagram announced that it is using AI to identify users attempting to skirt age limits by faking their age on the app.


    In the past few weeks, multiple platforms, including OpenAI, Meta, and YouTube, have released tools and controls focused on teen safety. These changes come amid growing concerns about the impact of social media on teen mental health and lawsuits against AI companies that allege they played a part in teen suicides.

    Ivan Mehta

  • Meta adds Hindi and Portuguese support for its AI translation feature for Reels | TechCrunch


    Meta is already putting Reels front and center in the Instagram experience on iPad and on mobile in countries like India and South Korea. Now, to let more people consume and understand Reels in different languages, the company is adding support for Hindi and Portuguese to its AI-powered translation feature across Instagram and Facebook as well.

    The company first launched this feature in August with support for English and Spanish after teasing it at its Meta Connect conference last year. With Hindi and Portuguese support, Meta aims to tap into creators from its biggest markets, like India and Brazil, to have a global reach.

    “We believe there are lots of amazing creators out there who have potential audiences who don’t necessarily speak the same language,” explained Instagram head Adam Mosseri in a post on Instagram in August.

    “And if we can help you reach those audiences who speak other languages, reach across cultural and linguistic barriers, we can help you grow your following and get more value out of Instagram and the platform,” he added.


    Users can turn on automatic translation in their preferred language to consume Reels that were originally created in another language.

    Creators can turn on automatic translation for their Reels by turning on the “Translate your voice with Meta AI” option before publishing. They can also select which languages to translate to after reviewing the translated video with automatic dub and lip sync.

    Screenshots of the Facebook interface where creators can view, review, and approve AI-powered translations of their Reels. (Image credits: Meta)

    Meta said that it is working on new features for AI-powered translation that will roll out soon. Reels on Facebook already support multi-speaker AI-powered translations. Meta said that this feature will be available for creators posting on Instagram soon.


    Apart from voice translations, the company is also building a way to translate text or caption stickers posted with Reels in supported languages. Users will be able to select the “Translate text on Reels” option when the feature is rolled out. This is useful for someone who is watching a video without sound.

    Screenshots of the Instagram menu for controlling AI-powered translation of Reels, where users can turn voice and text translations on or off. (Image credits: Meta)

    What’s more, Meta plans to roll out a new voice dubbing feature that will preserve the creator’s original voice and tone. The company will also give creators the option to use a new version of lip-syncing that better matches the movement of their mouths.

    Meta’s rival YouTube has been working on translation features for a few years now. Last month, the company rolled out lip-sync improvements to its auto-dubbing feature with support for 20 languages.

    Ivan Mehta

  • How to Create AI Videos For Insta Stories Using Meta AI


    • Yes, you can either save and then post it as a reel, or you can directly add the AI video to your Instagram story.
    • Now, as you can see in the final video, the details are a bit blurry, and the movement is not that precise.
    • You have to pay a decent amount of money to get access to them, and while Meta is free, the results it produces are not great, but not bad either.

    Meta is rolling out new updates, and this time, the latest addition is in the Meta App itself. With this new update, you will be able to create some super cool AI-generated videos and also share them for engagement. The Meta app also has a separate feed where you can see videos from other creators. You can now directly share or create these videos from Instagram, and the process is seamless. If you are wondering just how good the videos are, then rest assured. This article will tell you how you can create your own AI videos and whether they are actually good.

    Use AI to Create Instagram Story Videos

    In the coming years, I can see AI agents and AI avatars replacing human creators, and Meta is a perfect case study. First, it embedded AI chatbots into its platforms, then it introduced AI characters, and now it has opened the channels for everyone to create their own AI videos and post them for engagement. However, the quality of these videos is not that great if you compare them with other video generators such as Sora 2 or HeyGen. Then again, Sora and HeyGen are not that accessible to a general audience; you have to pay a decent amount of money to get access to them, and while Meta is free, the results it produces are not great, but not bad either. You can create some fun videos, but if you are looking for hyper-realistic results, you are barking up the wrong tree.

    How You Can Create Your Very Own AI Videos

    This is a fairly simple process. All you need to do is head over to the Story section of Instagram and then follow the steps mentioned below.

    1. Swipe right and tap on AI Images.


    2. Then you will land on a preview page, and on that page tap on the Blue Try It button.


    3. Doing this will take you to the Meta app; there, tap the ‘+’ icon at the top right.


    4. Enter your prompt into the text box below and then tap on the Blue arrow on the bottom right.


    5. You will get four images, and if you want to convert them into a video, then tap on the image of your choice.


    6. From the extended menu, click on the Animate button. Enter the prompt for your video.


    7. Once it is done processing, you can add music to the video. Simply tap on the Music icon.

    8. You also get other options like Edit, Restyle, and Extend.


    9. To share the video, click on the icon on the top right and choose your preferred option.


    Now, as you can see in the final video, the details are a bit blurry, and the movement is not that precise. However, if you prompt it correctly and do some editing, you will have a better result. Also, for a feature that is still in development, it is quite good and will keep on getting better with updates.

    FAQs

    Q. How many videos can we create in a day using the Meta app?

    As of now, there is no limit on the video generation feature. I have generated more than five videos in a day and also edited them.

    Q. Can I share the AI videos directly on my Instagram Story?

    Yes, you can either save and then post it as a reel, or you can directly add the AI video to your Instagram story. Simply tap on the Share button and then choose the option to Add to Instagram Story.

    Wrapping Up

    This article covers the new video generation feature of the Meta App. Meta is trying to build its own ecosystem of AI platforms and use cases. There is a dedicated feed in the Meta app that allows you to see the creations of other creators. With the right updates, this feature can be the next big thing for Meta.

    Dev Chaudhary

  • Meta plans to sell targeted ads based on data in your AI chats | TechCrunch


    Meta announced on Wednesday that data collected from user interactions with its AI products will soon be used to sell targeted ads across its social media platforms.

    The company will update its privacy policy by December 16 to reflect the change, and will notify users in the coming days. The new policy applies globally, except for users in South Korea, the United Kingdom, and the European Union, where privacy laws prevent this type of data collection.

    Meta’s core business has long relied on building detailed profiles of Facebook and Instagram users to sell hyper-targeted ads. The company offers advertisers a way to reach specific demographics and user groups. Now, Meta will also use data from conversations with its AI chatbot to build out those profiles, giving it another powerful signal to target its ads.

    The social media giant already has lots of information about its users, but Meta AI has created a rich new stream of information. The company says more than a billion people chat with Meta AI every month, and it’s common for users to hold long, detailed conversations with the AI chatbot. So far, Meta has largely given away its AI products for free, but now the company can improve its valuable ad products based on the data it collects.

    If a user chats with Meta AI about hiking, for example, the company may show ads for hiking gear. However, Meta spokesperson Emil Vazquez tells TechCrunch that the privacy update is broader than just Meta AI, and applies to the company’s other AI offerings.

    That means Meta may use data from AI features in its Ray-Ban Meta smart glasses — including voice recordings, pictures, and videos analyzed with AI — to further target its ad products. Meta may also use data from its new AI-video feed, Vibes, and its AI image generation product, Imagine.

    Conversations with Meta AI will only influence ads on Facebook and Instagram if a user is logged into the same account across products.


    There is no way to opt out, according to Meta.

    The privacy changes are another reminder that free products from Big Tech companies often come with strings attached. Many tech companies already use AI interactions to train their models. Meta, for instance, trains on voice recordings, photos, and videos analyzed through Meta AI on its smart glasses. Now it will also feed that data into its ad machine.

    In a briefing with reporters, Meta privacy policy manager Christy Harris said the company is still in the process of building out systems that will use AI interactions to improve its ad products. However, the company says user conversations with AI around sensitive topics — including religious views, sexual orientation, political views, health, racial or ethnic origin, philosophical beliefs, or trade union membership — will not be used to show them ads.

    Tech companies are starting to test out ways to monetize AI products, most of which are free today. On Monday, OpenAI unveiled a way to purchase products in ChatGPT, where the company will take a cut of transactions completed in the app. Earlier this year, Google revealed plans for how it would introduce ads into its AI-powered search product, called AI Mode.

    Meta says the company has “no plans imminently” to put ads in its AI products, though CEO Mark Zuckerberg has suggested they may be coming in the future.

    Maxwell Zeff

  • Ray-Ban Meta (2nd Gen) review: Smart glasses are finally getting useful


    In a lot of ways, Meta hasn’t changed much with its second-gen Ray-Ban glasses. The latest model has the same design and largely the same specs as the originals, with two important upgrades: longer battery life and improved video quality.

    At the same time, the Ray-Ban Meta glasses have a lot of features that didn’t exist when I first reviewed them two years ago, largely thanks to AI. And with the release of its second-generation frames, there’s still a lot to look forward to, like new camera features and AI-powered audio. The good news is that Meta isn’t limiting these updates to its newest frames, so if you have an older pair you’ll still see the new features. But, if you’ve been on the fence about getting a pair, there’s never been a better time to jump in. 


    Meta’s second-generation smart glasses are becoming a genuinely useful accessory.

    Pros

    • Noticeably better battery life
    • YouTuber-friendly 3K video
    • Meta AI translations are a game-changer for travel
    Cons

    • Framing POV photos and video is still a challenge
    • Pricey lens upgrades


    Same look, (slightly) better specs

    Meta and EssilorLuxottica haven’t strayed too far from the playbook they’ve used for the last two years. The second-generation Ray-Ban Meta glasses come in a handful of frame styles with a number of color and lens variations that start at $379. I tried out a pair of Wayfarer frames in the new “shiny cosmic blue” color with clear transition lenses. 

    I personally prefer the look of the slightly narrower Headliner frames, but the second-gen glasses still look very much like traditional Wayfarer glasses. I’ve never been a fan of transition lenses for my own prescription eyewear, but I’m starting to come around on them for smart glasses. As Meta has improved its cameras and made its AI assistant more useful, I’ve found more reasons to wear the glasses indoors.

    The second-generation Ray-Ban Meta glasses come with clear frames, with polarized and transition lenses available as an upgrade.

    (Karissa Bell for Engadget)

    Also, if you’re going to be paying $300 or more for a pair, you might as well be able to use them wherever you are. It also helps that the transition lenses on the second-gen Ray-Ban Meta glasses get a bit darker than my first-gen Wayfarers with transition lenses. Upgrading from the standard clear lenses will cost you, though. Frames with polarized lenses start at $409, transitions start at $459 and prescription lenses can run significantly more. 

    As with the recent Oakley Meta HSTN glasses, the second-gen Ray-Bans come with a longer battery life and better camera. Meta says the battery can last up to eight hours on a single charge with “typical use.” I was able to squeeze a little more than five and a half hours of continuous music playback. That’s a noticeable step up from the battery on my original pair which, after two years, is starting to show its age. The glasses also now support higher-resolution 3K video recording, but the 12MP wide-angle lens shoots the same 3,024 x 4,032 pixel portrait photos as earlier models.

    The second-gen glasses have the same design as the first-gen, with a capture button on the right side of the frames. The charging case provides an additional 48 hours of battery life.

    (Karissa Bell for Engadget)

    For videos, there’s a noticeable quality boost, but I still think it’s probably not necessary for most people if you’re primarily sharing your clips on social media. It does make the glasses more appealing for creators, though, and judging by the number of them in attendance at Connect, I suspect Meta sees them as a significant part of its user base. I’m looking forward to Meta adding the ability to record Hyperlapse and slow-motion videos, though, as I think these may be more interesting than the standard POV footage for everyday activities. 

    Meta AI + what’s coming

    Two years ago, I was fairly skeptical of Meta’s AI assistant. But since then, Meta has steadily added new capabilities. Of those, the glasses’ translation abilities have been my favorite. On a recent trip to Argentina, I used live translation to follow along with a walking tour of the famous Recoleta cemetery. It wasn’t perfect — the feature is meant more for back-and-forth conversations rather than extended monologues — but it allowed me to participate in a tour I would have otherwise had to skip. (A word of warning: using the live translation for an extended period of time is a major battery killer.)

    Meta AI can provide context and translations in other scenarios, too. I spent some time in Germany while testing the second-gen Ray-Ban glasses and found myself repeatedly asking Meta to translate signs and notices. For example, here’s how Meta AI summarized this collection of signs.

    Meta AI was able to translate these signs (left) when I asked it “what do these signs say?”

    (Karissa Bell for Engadget)

    As I wrote in my review of the Oakley Meta HSTN glasses, I still haven’t found much use for Live AI, which lets you interact with the assistant in real-time and ask questions about your surroundings. It still feels like more of a novelty, but it makes for a fun demo to show off to friends who have never tried “AI glasses.” There are also some very interesting accessibility use cases that take advantage of the glasses’ cameras and AI capabilities. Features like “detailed responses” and support for “Be My Eyes” show how smart glasses can be particularly impactful for people who are blind or deal with low vision.

    One AI-powered feature I haven’t tried out yet is Conversation Focus, which can adjust the volume of the person you’re speaking to while dampening the background noise. Meta teased the feature at Connect, but hasn’t said exactly when it will be available. But if it works as intended, I could see it being useful in a lot of scenarios.

    I’m also particularly intrigued by Meta’s Connect announcement that it will finally allow third-party developers to create their own integrations for its smart glasses. There are already a handful of partners, like Twitch and Disney, which are finding ways to take advantage of the glasses’ camera and AI features. Up to now, Meta AI’s multimodal tools have shown some promise, but I haven’t really been able to find many ways to use the capabilities in my day-to-day life. 

    Allowing app makers onto the platform could change that. Disney has previewed a smart glasses integration for inside of its parks that would allow visitors to get real-time info about the rides, attractions and other amenities as they walk around. Golf app 18Birdies has shown off an app to deliver stats and other info while you’re on the course.

    Should you buy these? And what about privacy?

    When the Ray-Ban Meta glasses came out two years ago, this was a pretty straightforward question to answer. If the idea of smart glasses with a good camera and open-ear speakers appealed to you, then buying a pair was a no-brainer. 

    Now, it’s a bit more complicated. Meta is still updating its first-gen Ray-Ban glasses with significant new features, like Conversation Focus, new camera modes and third-party app integrations. So if you already have a pair, you won’t be missing out on a ton if you don’t upgrade. (And with a starting price of $299, the first-gen glasses are still solid if you want a more budget-friendly option.)

    There are also other options to consider. The upcoming Oakley Meta Vanguard glasses come with more substantial hardware upgrades and other unique features that will appeal to athletes and anyone who spends a lot of time outdoors. And on the higher end, there are the $799 Meta Ray-Ban Display glasses that blend AR elements with its existing features in an intriguing way. 

    Meta has already previewed several new features, like new camera modes and Conversation Focus.

    (Karissa Bell for Engadget)

    I also have many of the same concerns about privacy as I did when I reviewed Meta’s first Ray-Ban branded glasses back in 2021. I’m well aware Meta already collects an extraordinary amount of data about us through its apps, but glasses just feel like they provide much more personal, and potentially invasive, access to our lives.

    Meta has also made some notable changes to the privacy policy for its glasses in recent months. It no longer allows users in the United States to opt out of storing voice recordings in its cloud, though it’s still possible to manually delete recordings in the Meta AI app. 

    The company says it won’t use the contents of the photos and videos you capture to train its AI models or serve ads. However, images of your surroundings processed for the glasses’ multimodal features like Live AI can be used for training purposes (these images aren’t saved to your device’s camera roll). Meta’s privacy policy also states that it uses audio captured via voice commands for training. And it should go without saying, but anyone using Meta’s glasses should be very careful about sharing their interactions with its AI app, as a bunch of users have already seemingly inadvertently shared a ton of highly-personal interactions with the world. 

    If any of that makes you uncomfortable, I’m not here to convince you otherwise! We’re still grappling with the long-term privacy implications of generative AI, much less generative AI on camera-enabled wearables. At the same time, as someone who has been wearing Meta’s smart glasses on and off for more than four years, I can say that Meta has been able to turn something that once felt gimmicky into a genuinely useful accessory. 


  • What to expect at Meta Connect 2025: ‘Hypernova’ smart glasses, AI and the metaverse


    Meta Connect, the company’s annual event dedicated to all things AR, VR, AI and the metaverse, is just days away. And once again, it seems like it will be a big year for smart glasses and AI.

    This year, the event will take a slightly different format than in the past. Mark Zuckerberg will kick things off with an evening keynote at 5PM PT on Wednesday, September 17. A developer keynote with other executives will take place the next morning on September 18, beginning at 10AM, with more talks and developer sessions to follow.

    It’s not clear why Meta changed things up this year, but it is shaping up to be a particularly eventful year for Connect. We’re expecting two new models of smart glasses, including Meta’s first to have a display, as well as new Meta AI and metaverse updates. As usual, Engadget will be reporting live from Zuckerberg’s keynote at Meta HQ, but until then, here’s a closer look at what’s coming and what to keep an eye on.

    New and updated smart glasses

    The biggest news of the day will be Meta’s next generation of smart glasses. The frames, often referred to by their reported internal name “Hypernova,” will be the first consumer-ready glasses from Meta that have a display. We already know quite a bit about them thanks to more than a year of leaks.

    While the frames are expected to have a small display on one side, they won’t offer the kind of immersive augmented reality experience we’ve seen on Meta’s Orion prototype. Instead, the display will allow you to view things like notifications and photo previews. The glasses will also come with a dedicated wristband, similar to what the company showed off with Orion, that allows the wearer to control specific features through hand gestures.

    The EMG wristband that’s part of the Orion prototype.

    (Karissa Bell for Engadget)

    The glasses, which may officially be called “Celeste,” are expected to go on sale later this year and will likely cost around $800. They could be sold with Prada branding, which would be in line with Meta’s longtime EssilorLuxottica partnership, according to CNBC. Given the much higher price tag — most of Meta’s Ray-Ban-branded glasses cost around $300 — it seems Meta is positioning this as a higher-end product with more limited appeal. Analyst Ming-Chi Kuo has predicted that Hypernova will have a “negligible” share of the overall smart glasses market.

    It also sounds like we could see a new version of Meta’s display-free smart glasses in the form of an updated version of the Ray-Ban Meta smart glasses. There could be two versions, with sunglasses and clear frames, according to reports. The new glasses are reported to have improved cameras and battery life, and to support new AI capabilities.

    We could also see new third-party glasses integrations. As UploadVR recently noted, early versions of the Connect schedule for developers seemingly confirm that Meta is getting ready to give developers access to its smart glasses. Up to now, the Ray-Ban Meta and Oakley glasses have mostly been limited to apps within Meta’s ecosystem (with a few exceptions like Spotify and Audible). Allowing more developers to start experimenting with the platform could bring even more functionality to the existing lineup of glasses.

    Meta AI

    As in other recent years, AI will be a major theme throughout. Meta AI has a massive base of monthly users (a figure Zuckerberg will surely remind us of), and I’m expecting to see new features for Meta AI both on the company’s glasses and within its apps. Business Insider has reported that the company has been working on a new lineup of non-English-speaking “character-driven” bots for its apps. (Meta’s character-centric chatbots have also faced scrutiny, with the company recently blocking teens’ access to many user-generated characters amid growing safety concerns.)

    Outside of Meta’s chatbots, I’m hoping Zuckerberg will talk more about his vision to create “superintelligence.” As I wrote in July, the note that outlined his vision was confusing at best. The CEO has recently reorganized Meta’s AI teams around the idea, and has been on a very expensive hiring spree to recruit executives and researchers for the effort.

    At the same time, Zuckerberg could use Connect to shore up expectations around its Llama models. The company’s larger Llama 4 model has been delayed, and reports suggest Meta’s engineers have been struggling to improve it. There are other signs that Zuckerberg may be backing away from open-source AI.

    What about the metaverse?

    While the metaverse has taken somewhat of a backseat to AI in recent years, it wouldn’t be Connect without some VR-related news. In a recent Instagram post, Meta CTO Andrew Bosworth teased “metaverse software” updates related to Horizon Worlds at Connect. The company recently gave developers a preview of its plan to bring AI-powered NPCs to the metaverse, and I expect we’ll hear more about how generative AI could help shape the metaverse.

    And while no new Quest headsets are expected, we could hear more about the third-party VR headsets that will run Meta’s VR software. Last year, the company announced that outside hardware makers were working on Meta Horizon OS headsets. We haven’t heard too much about these devices since, but a report this year suggested ASUS would be the first to launch, and that its headset would include face and eye tracking features.

    Another intriguing possibility is an update on the holographic Codec avatars Meta showed off last year. While Meta’s current lineup of VR headsets doesn’t have the necessary face and eye-tracking sensors to support the tech, UploadVR reports that Meta could show off a more “rudimentary” version of the avatars that could run on the Quest 3 or even work in conjunction with video calls on WhatsApp and Messenger.

    Karissa Bell

  • Meta is re-training its AI so it won’t discuss self-harm or have romantic conversations with teens


    Meta is re-training its AI and adding new protections to keep teen users from discussing harmful topics with the company’s chatbots. The company says it’s adding new “guardrails as an extra precaution” to prevent teens from discussing self harm, disordered eating and suicide with Meta AI. Meta will also stop teens from accessing user-generated chatbot characters that might engage in inappropriate conversations.

    The changes, which were first reported by TechCrunch, come after numerous reports have called attention to alarming interactions between Meta AI and teens. Earlier this month, Reuters reported on an internal Meta policy document that said the company’s AI chatbots were permitted to have “sensual” conversations with underage users. Meta later said that language was “erroneous and inconsistent with our policies” and had been removed. Yesterday, The Washington Post reported on a study that found Meta AI was able to “coach teen accounts on suicide, self-harm and eating disorders.”

    Meta is now stepping up its internal “guardrails” so those types of interactions should no longer be possible for teens on Instagram and Facebook. “We built protections for teens into our AI products from the start, including designing them to respond safely to prompts about self-harm, suicide, and disordered eating,” Meta spokesperson Stephanie Otway told Engadget in a statement.

    “As our community grows and technology evolves, we’re continually learning about how young people may interact with these tools and strengthening our protections accordingly. As we continue to refine our systems, we’re adding more guardrails as an extra precaution — including training our AIs not to engage with teens on these topics, but to guide them to expert resources, and limiting teen access to a select group of AI characters for now.”

    Notably, the new protections are described as being in place “for now,” as Meta is apparently still working on more permanent measures to address growing concerns around teen safety and its AI. “These updates are already in progress, and we will continue to adapt our approach to help ensure teens have safe, age-appropriate experiences with AI,” Otway said. The new protections will be rolling out over the next few weeks and apply to all teen users using Meta AI in English-speaking countries.

    Meta’s policies have also caught the attention of lawmakers and other officials, with Senator Josh Hawley recently telling the company he planned to launch an investigation over its handling of such interactions. Texas Attorney General Ken Paxton has also indicated he wants to investigate Meta for allegedly misleading children about mental health claims made by its chatbots.

    Karissa Bell

  • Meta changes its label from ‘Made with AI’ to ‘AI info’ to indicate use of AI in photos | TechCrunch


    After Meta started tagging photos with a “Made with AI” label in May, photographers complained that the social networking company had been applying labels to real photos where they had used some basic editing tools.

    Because of the user feedback and general confusion around what level of AI is used in a photo, the company is changing the tag to “AI Info” across all of Meta’s apps.

    Meta said the earlier version of the tag didn’t make it clear enough to users that a tagged image was not necessarily created with AI, but might simply have been edited with AI-powered tools.

    “Like others across the industry, we’ve found that our labels based on these indicators weren’t always aligned with people’s expectations and didn’t always provide enough context. For example, some content that included minor modifications using AI, such as retouching tools, included industry standard indicators that were then labeled ‘Made with AI’,” the company said in an updated blog post.


    The company is not changing the underlying technology for detecting use of AI in photos and labeling them. Meta still uses information from technical metadata standards such as C2PA and IPTC that include information about use of AI tools.
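    For the curious, those indicators are fairly easy to inspect yourself. C2PA embeds a signed manifest in the file, while the IPTC standard records a “digital source type” in the image’s XMP metadata, using values such as trainedAlgorithmicMedia (fully AI-generated) or compositeWithTrainedAlgorithmicMedia (composited with AI-generated elements). The sketch below simply scans a JPEG’s embedded XMP packet for those markers; it is an illustration, not how Meta’s detection pipeline works, and a production check would use a proper C2PA/XMP parser.

```python
import re
import sys

# IPTC DigitalSourceType values commonly used to flag AI involvement.
AI_SOURCE_TYPES = {
    "trainedAlgorithmicMedia": "fully AI-generated",
    "compositeWithTrainedAlgorithmicMedia": "composited with AI-generated elements",
}

def check_xmp_for_ai(path: str) -> None:
    with open(path, "rb") as f:
        data = f.read()
    # XMP is stored as an XML packet embedded directly in the file.
    match = re.search(rb"<x:xmpmeta.*?</x:xmpmeta>", data, re.DOTALL)
    if not match:
        print("No XMP packet found.")
        return
    xmp = match.group(0).decode("utf-8", errors="replace")
    for token, meaning in AI_SOURCE_TYPES.items():
        if token in xmp:
            print(f"DigitalSourceType '{token}': {meaning}")
            return
    print("No AI digital-source-type marker found in XMP.")

if __name__ == "__main__":
    check_xmp_for_ai(sys.argv[1])
```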

    That means if photographers use tools like Adobe’s Generative Fill to remove objects, their photos might still be tagged with the new label. However, Meta hopes the new label will help people understand that a tagged image is not necessarily created entirely by AI.

    “‘AI Info’ can encompass content that was made and/or modified with AI so the hope is that this is more in line with people’s expectations, while we work with companies across the industry to improve the process,” Meta spokesperson Kate McLaughlin told TechCrunch over email.

    The new tag will still not solve the problem of completely AI-generated photos going undetected. And it won’t tell users about how much AI-powered editing has been done on an image.

    Meta and other social networks will need to set guidelines that are fair to photographers who haven’t changed their editing workflows but whose touch-up tools now include some generative AI element. On the other hand, companies like Adobe should warn photographers that when they use certain tools, their images might be tagged with a label on other services.

    Ivan Mehta

  • Do you believe in job after job? | TechCrunch


    People moving on to new jobs is not a bad thing — and not only when they have been laid off. That’s why it’s uplifting to see employers encourage this process.


    Anna Heim