Latest Sectors News
Shift to solar a likely windfall for China, the world’s largest provider of solar technology
Beijing has alleged that India’s tariff structure and domestic preference policies discriminate against Chinese products and violate WTO agreements related to trade, subsidies and investment measures.
ACME Solar shares have soared 21% YTD despite market volatility caused by geopolitical tensions and economic shifts. Analysts from HSBC and Motilal Oswal foresee further growth, citing the company’s strategic shift and capacity expansion plans.
India Inc saw major wins across sectors as Rossell Techsys expanded capacity, Bharat Forge entered Embraer’s aerospace supply chain, HFCL secured export orders, and firms won large infra, energy, and global engineering contracts.
Before joining Redwood Materials, Ahuja served as chief financial and business officer at the drone delivery startup Zipline for about three years.
Redwood Materials, the electric vehicle battery recycling business started by Tesla board member and former CTO JB Straubel, is bringing on another former Tesla executive, Deepak Ahuja, as CFO, the company announced Monday.

Ahuja served as finance chief at Tesla from March 2017 to March 2019, his second term at Elon Musk's EV and clean energy company. He first joined Tesla in 2008, navigated it through an IPO in 2010, briefly resigned in 2015 and was recruited back two years later.

Ahuja told CNBC that his relationship with Straubel primarily influenced his decision to join the recycling startup. "Knowing JB for the last 18 years, I have huge respect for him as a leader, an engineer and as a thinker. And knowing so many of the leadership team who are from Tesla makes it easier for me to step in with a sense of credibility and build the business," he said. "There are different business models, different areas of growth and capital allocation, that it's still going to be a learning experience for me."

Straubel originally started Redwood Materials in 2017, running it while concurrently serving as Tesla CTO until July 2019. The Carson City, Nevada-based startup has raised over $2.3 billion in venture funding from an array of venture firms and strategic backers, including Google, Nvidia's NVentures, Microsoft, OMERS and Eclipse, and has also secured a $2 billion loan commitment from the Department of Energy. Redwood Materials now boasts a valuation of over $6 billion.

The incoming CFO also lauded Redwood Materials for work that ensures critical minerals, like lithium, cobalt, nickel and others, "stay within the country." Such minerals are crucial for the production of consumer electronics, vehicles, defense and energy products. "That's super motivating for me: the scale of how much this is going to grow, and the critical need for it in the country," he said.
After he resigned from Tesla in 2019, Ahuja served as CFO of Verily Life Sciences, then in 2022 joined Zipline, the drone delivery company, where he worked as chief business and financial officer. Zipline, ranked No. 46 on the 2025 CNBC Disruptor list, is the world's largest drone delivery company and has logged more than 2.3 million commercial deliveries to date. The company recently closed an $800 million round of funding at a valuation of $7.8 billion. Zipline's delivery drones are fully electric. A Zipline spokesperson told CNBC that Ahuja remains a close advisor to the company.

Redwood Materials views the batteries from EVs and other machines and devices as some of the most valuable energy assets in the country. That's because the batteries still have capacity to store energy when they reach the end of their useful life in vehicles and other devices, and spent batteries in general contain critical minerals that can be extracted and used in new products.

In its early years, Redwood Materials focused on "closed loop" recycling, taking end-of-life electric vehicle batteries and scrap from car factories and turning those into raw materials and components to make new battery cells. Today, the company also builds and deploys battery energy storage systems, which can store power derived from intermittent, renewable energy sources, like solar, wind and water, for use at a later time. The systems made by Redwood Materials include repurposed, or "second-life," EV batteries.

The data center boom in the U.S. is driving significant demand for the systems, which are also used at factories, in defense operations and to stabilize grid operations. "If we don't have battery systems, our grid is just falling behind, and we can't have off-grid solutions for even large, industrial or commercial needs that we may have," Ahuja told CNBC.

Ahuja arrives at Redwood Materials less than a month after the company implemented a restructuring in which it cut about 10% of headcount, or 135 people, partly to refocus resources on its energy division, TechCrunch first reported. The company had been without a CFO for more than a year after its prior finance leader, Jason Thompson, left Redwood Materials and joined The Nuclear Company in Reno in December 2024. "Redwood today is the strongest it's ever been," Straubel wrote in a widely distributed email informing employees of the cuts on April 15. "The materials business is well on its way to profitability and has an exciting road map ahead and we're seeing great momentum in Redwood Energy."

Ahuja told CNBC that he sees demand for fully electric vehicles growing in the U.S. despite some recent ups and downs. In its energy storage business, Redwood Materials has been striking deals with partners like Ford, Rivian and others, and has built a 12-megawatt, 63-megawatt-hour microgrid, which it calls the "largest second-life battery deployment in the world," in Abilene, Texas, for the AI infrastructure company Crusoe.

Video: How used EV batteries are being used to power AI data centers
As public support for large-scale data center buildouts declines across the U.S., a new type designed to operate inside individual homes is coming.
Video: Homes could become mini data centers to power AI growth

Data centers are gobbling up land, driving up electric bills and becoming a lightning rod for public discontent over big tech's power in society.

Maine's legislature recently passed a data center ban in the state (but failed to override the governor's veto). According to the National Conference of State Legislatures, 14 states spanning the political spectrum, from Oklahoma to New York, are considering legislation that would ban or pause new data centers, as public opinion on AI has increasingly shifted to the negative.

Still, despite the qualms of the public and politicians, there's a torrent of capital for building new data centers. The biggest technology companies in the U.S. are on pace to spend as much as $1 trillion annually by 2027 on AI, according to recent Wall Street estimates. Globally, a recent McKinsey report forecasts spending on data centers will hit $7 trillion by 2030.

At the same time, the idea of putting data centers closer to consumers, even onto and into their homes, is gaining traction in real estate circles. Major players in housing, including homebuilder PulteGroup, are in early testing with Nvidia and California-based startup Span to install small fractional data center "nodes" on the exterior walls of newly built homes, according to recent reporting from CNBC's Diana Olick.

The question of whether that model can scale, and whether homeowners, HOAs and regulators will approve it, is up for debate. Experts point to some benefits of home-based data centers: distributing compute across homes could reduce the need to build new centralized facilities and improve energy efficiency.

"It is technically possible and already being explored," said Balaji Tammabattula, chief operating officer at BaRupOn, a U.S.-based energy and technology company currently building out a data center campus in Liberty County, Texas.
He said that just as a home computer can contribute processing power to a distributed network, a home can host compute hardware that feeds into a larger data processing system.

Photo: Advocacy groups and community members protest data center laws outside the Texas Capitol in Austin, Feb. 23, 2026.

The home-as-data-center model would follow similar attempts at using latent home power for crypto mining or to sell excess rooftop solar power or EV credits. "Feasibility depends on available power, internet connectivity, heat management, and the type of workload. For batch processing and non-time-sensitive tasks, the home environment works surprisingly well," Tammabattula said, though for high-density AI training or real-time workloads, residential constraints are harder to overcome.

Real-world examples are unfolding now as proof of concept, as waste heat from data centers receives more attention in Europe. For instance, a UK-based startup called Heata installs servers in people's homes that process cloud computing workloads while channeling the heat generated directly into the home's hot water cylinder, effectively giving homeowners free hot water in exchange for hosting the hardware. British Gas has backed a trial of this model. At a larger scale, operations have just commenced for heat pumps that route waste heat from Microsoft data centers in Finland to warm approximately 250,000 local residents' homes. "These examples show the concept working at both the household level and the community level," Tammabattula said.

The home data center brings with it a ledger of pros and cons. On the positive side, the residential model reduces land and infrastructure requirements that are becoming serious bottlenecks, distributes compute closer to end users, and creates a natural incentive for homeowners through energy savings, said Tammabattula.
He added that home computing also has a strong sustainability angle, since waste heat gets repurposed rather than cooled away at great expense.

But your questions for ChatGPT or Claude aren't likely to be generated from a server in someone's walk-in closet or basement anytime soon, since those deep interactions with AI still require sprawling data centers. Residential environments currently lack the power density, redundancy, physical security and environmental controls that enterprise workloads require. And if you can't get a signal for your own WiFi or phone call, you can't power a data center.

"Connectivity quality varies across households, creating reliability issues at scale. There are also regulatory and insurance questions around hosting commercial equipment in private homes," Tammabattula said. Currently, the economics only work for specific workload types like batch processing, rendering and research computation. "Anything requiring guaranteed uptime or low latency is not a good fit for this model yet," he added.

Home-based data center vs. the hyperscaler

Given the limitations, the home data center is far more likely to become a niche layer of future infrastructure than a replacement for hyperscale data centers. The home data center models also typically involve a third party owning and operating the equipment, so the homeowner does not need to manage anything technically.

"Homes are not going to replace hyperscale data centers, especially for large AI training clusters that need dense power, high-speed networking, specialized cooling, and tightly controlled environments," said Gerald Ramdeen of Luxcore, a company developing next-generation optical networking and decentralized cloud infrastructure.
He says a more realistic opportunity would be to turn homes into professionally managed edge compute nodes, useful for AI inference, low-latency workloads, flexible/batch compute, cloud gaming and certain heat-reuse applications. This approach has implications for everyday life as it increasingly intersects with, and through, AI.

"It can be used to sort the seven bazillion photos your teenage daughter has," said Sean Farney, vice president of data center strategy for the Americas at JLL, a U.S.-based global professional services and commercial real estate firm that manages 4.4 GW of data center space globally across more than 340 data center sites.

Farney noted your smartphone has more computing capacity than the first data center ever built, so while the idea of a home data center hasn't taken off at scale yet, it probably will. "It's hard to compete with a hyperscaler because it's expensive operationally to maintain a super distributed footprint. But it can be done, and the company that gets it right is looking at a nice-sized valuation," he said.

Some technical limitations would still need to be overcome before home data centers could succeed at commercial scale. For one, the home would need a fairly reliable supply of electrical and mechanical resources, since, as Farney notes, a data center quickly exceeds residential power supply. "A 20-kilowatt residential generator doesn't even give you a cabinet of AI servers," he said. But if technology is able to address these issues, would homes be able to overcome the scale effect of data centers? Farney thinks the answer is yes.

AI cybersecurity and physical security are issues

Aimee Simpson, director of product marketing at Huntress, a global cybersecurity company, says one reason to be skeptical of home-based data centers catching on is the cybersecurity vulnerabilities. "A collection of home-based micro data centers creates the need for a more robust network security approach," Simpson said.
While there are potential decentralization benefits from a home-based network operating at scale (more sites means more redundancy in case any one data center goes down), expanding the footprint also makes security more complex. "Each site's hardware and software would need to be secure, and carefully monitored, to avoid any vulnerabilities," Simpson said.

Physical security of the site, meanwhile, "would be almost impossible to guarantee," she said. "There's a reason that mega data centers run by the likes of Amazon and Microsoft are surrounded by high fences and guarded 24/7."

Photo: The Microsoft data center campus, currently under construction, in Mount Pleasant, Wisconsin, September 18, 2025.

"I can't imagine a world where end users with data security and compliance obligations would be comfortable with the idea of their sensitive, confidential information being processed and managed by servers that are potentially sitting in someone's garage," Simpson said. Still, she knows of legitimate networks of micro data centers that use tamper-proof physical containers. If these could be located in residences, that could temper some security concerns.

According to Arthur Ream, a computer information systems lecturer at Bentley University, the home-as-data-center model is plausible, already happening, and a sensible answer for inference workloads, if not training. "The interesting question isn't whether residential compute works.
It's whether the security, reliability, and regulatory story holds up at gigawatt scale or whether the industry has quietly figured out that the cheapest place to put the operational risk of AI is in someone else's utility room," Ream said.

Span is pioneering the model, according to Ream, with examples like the work with Nvidia and PulteGroup, where Span owns and installs liquid-cooled Nvidia RTX PRO 6000 Blackwell GPUs in residential homes, then sells the compute to hyperscalers and AI cloud providers while the homeowner gets a Span smart panel, battery backup, and discounted rates for electricity and internet. Homeowners pay a fee of roughly $150 a month covering electricity and internet; installation is free, while Span sells the compute to AI customers.

"The economic argument is the one to take seriously: a 100 MW data center costs roughly $15 million/megawatt and takes three to five years to build. Span claims it can match that capacity by deploying XFRA nodes across 8,000 new homes in about six months at $3 million/megawatt. Even haircut that aggressively for marketing math, the speed-to-power gap is real," Ream said.

Other experts are less circumspect and say the concept won't work. "Infrastructure for AI isn't infrastructure for crypto. You don't run data centers in basements," said Sviat Dulianinov, the chief strategy officer of Bright Machines, a San Francisco-based software and robotics company. Modern AI runs on "AI factories" of thousands of GPUs working together, requiring complex engineering, precision manufacturing, and tightly integrated supply chains, from server and rack build to deployment. "It also demands industrial-scale power and cooling.
Compute will move closer to the edge, but it will be standardized, engineered systems versus crowdsourced home data centers," Dulianinov said.

And with data centers drawing the ire of communities from coast to coast, real estate professionals are paying close attention to the developments, but have their own reservations about how residential communities will react. "HOAs would absolutely go to town on this idea," said Jeff Lichtenstein, president and founder of Echo Fine Properties in Palm Beach Gardens, Florida. "I can't even imagine our Facebook community page. Fighting between data companies and cities and homeowner associations would make typical Republican versus Democrat fighting look like child's play," Lichtenstein said.
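Ream's cost comparison can be sanity-checked with some back-of-the-envelope arithmetic. The inputs below are the claims quoted in the article (Ream's $15 million/megawatt conventional figure and Span's $3 million/megawatt, 8,000-home figures), not verified data, and the per-home capacity is derived here rather than stated anywhere in the piece:

```python
# Sanity-checking the quoted data center cost claims.
# All inputs are figures quoted in the article, not verified data.

capacity_mw = 100                 # target capacity of the comparison build
traditional_cost_per_mw = 15e6    # ~$15M/MW for a conventional data center
span_cost_per_mw = 3e6            # Span's claimed $3M/MW for home nodes
homes = 8000                      # homes Span says it needs for 100 MW

traditional_total = capacity_mw * traditional_cost_per_mw  # 1.5e9 -> $1.5B
span_total = capacity_mw * span_cost_per_mw                # 3e8   -> $300M
per_home_kw = capacity_mw * 1000 / homes                   # 12.5 kW per home

print(f"Conventional build: ${traditional_total / 1e9:.1f}B over 3-5 years")
print(f"Home-node build:    ${span_total / 1e6:.0f}M over ~6 months (claimed)")
print(f"Implied capacity per home: {per_home_kw:.1f} kW")
```

Notably, the implied 12.5 kW per home sits below the 20-kilowatt residential generator Farney cites as roughly the ceiling of what a home can support, so the claimed deployment is at least not obviously inconsistent with residential power limits.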
Revenue from operations rose to ₹1,453 crore in the quarter from ₹1,194 crore in the same period a year ago.
The investment story for Corning looks even sweeter in the wake of Jim Cramer's sit-down with CEO Wendell Weeks.
The investment story for Corning looks even sweeter in the wake of Jim Cramer's sit-down with CEO Wendell Weeks on "Mad Money" on Thursday night. One day after Corning's blockbuster optical partnership with Nvidia, Weeks shed some more light on the company's new supply agreements with two unnamed hyperscalers. Corning first disclosed these along with earnings last week, but details were light.

Here's the comment from Weeks in Jim's interview that caught our attention: "Probably the biggest commercial arrangement ever in my career we just entered into with Nvidia, and then these other two major ones are larger than the Meta deal that's been public on, and I'm sure some of those customers will want to be more open about that over time."

We knew they were about the same size as the Meta deal, which Corning went public with back in January, saying at the time it was worth up to $6 billion through 2030 to supply fiber optic cables for data centers. But to hear they are actually larger is, of course, encouraging. At the very least, it gives us a floor for these combined deals at about $12 billion.

On Corning's April 28 earnings call, this is what Weeks had to say about the two new deals: "On our last call [in January], I shared that we were in the process of concluding other agreements of the same size and duration as the Meta agreement. We now have concluded two more large, long-term agreements with hyperscale customers. And they are each similar in size and duration. Now I know, we will get questions on who the other customers are and the specifics of our arrangements. However, our philosophy is to let our customers decide when, and where they choose to make announcements on their critical supply chain decisions. I can share that these deals are very significant, and they share the risk and rewards of the required expansions with our strategic customers."

Now back to Thursday's remarks.
We had figured those two hyperscale deals were worth about $6 billion each; now it seems that figure is more like a floor. Corning may be celebrating its 175th anniversary this week, but it's got the growth prospects of a young startup in a new field, which just so happens to be at the center of, well, everything. And, when you combine that outlook with the experience that comes from nearly two centuries of operation, you get something worth owning.

Corning's execution may not have been perfect over those many, many decades, but the lessons learned are clearly on display with these recent deals. The company has been burned before by capacity expansion investments made ahead of revenue that never materialized. However, that is exactly why we are now seeing deals like these, which have customers share not only in the rewards of capacity expansion but also in the risks that come with it. That's why we are so thrilled about these hyperscale deals and why we are more than happy to suffer a bit of dilution as Corning shareholders, in exchange for a deeper relationship with the sun at the center of the AI solar system, Nvidia.

Jim also interviewed Nvidia CEO Jensen Huang on Thursday evening about his company's Corning alliance. Jensen said it will "revitalize American manufacturing." After all, there aren't many companies around like Corning that can say they went from ushering in the age of electricity with glass for light bulbs to ushering in the AI revolution with glass wires that will one day transport the world's data at the speed of light. There aren't many, but there is at least one, and we plan to hold onto it for the run that's about to take place on top of this year's already more than 100% surge in shares.

(Jim Cramer's Charitable Trust is long GLW, NVDA, and META. See here for a full list of the stocks.)

As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade.
Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust's portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade. THE ABOVE INVESTING CLUB INFORMATION IS SUBJECT TO OUR TERMS AND CONDITIONS AND PRIVACY POLICY, TOGETHER WITH OUR DISCLAIMER. NO FIDUCIARY OBLIGATION OR DUTY EXISTS, OR IS CREATED, BY VIRTUE OF YOUR RECEIPT OF ANY INFORMATION PROVIDED IN CONNECTION WITH THE INVESTING CLUB. NO SPECIFIC OUTCOME OR PROFIT IS GUARANTEED.