
    From edge to cloud: The critical role of hardware in AI applications

    The rise of generative artificial intelligence

    This blog explores the role of artificial intelligence and automation in technology, and the hardware needed to support them.

    AI is quickly becoming ubiquitous. The world has woken up to the power of generative AI, and a whole ecosystem of applications and tools is rapidly coming to life. AI is becoming increasingly important across industries, including healthcare, finance, and transportation.

    All of this has a tremendous impact on the digital value chain and the semiconductor hardware market that cannot be overlooked. These apps and tools have to gather and process data, and deliver results back to the consumer with minimal latency. This also has to happen at scale, given the rapidly emerging ecosystem.

    Hardware innovation is imperative to sustain this revolution. As I think about AI's growth and its impact on hardware systems and the silicon roadmap, my own journey as a content creator, using Midjourney for art or ChatGPT for editorial help (as for this blog), certainly helps me see the big picture.

    Image by Midjourney, Inc.

    So what does it take on the hardware side?

    Increased demand for high-performance computing in cloud data centers

    AI workloads require specialized processors that can handle complex algorithms and large amounts of data. This has led to the development of new processors, such as graphics processing units (GPUs), field-programmable gate arrays (FPGAs) and custom AI silicon, that are optimized for these workloads.
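
    To make this concrete, here is a minimal sketch (assuming PyTorch and, optionally, a CUDA-capable GPU; neither is mentioned in the original post) of the dense matrix multiply at the heart of most neural-network layers, dispatched to whichever device is available:

```python
# Minimal sketch: run the same matrix multiply on a GPU if one is present.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Two large random matrices placed directly on the chosen device.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # the core operation GPUs and custom AI silicon are optimized for
print(f"4096x4096 matmul executed on: {device}")
```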

    Memory and storage

    The vast amount of data generated by AI workloads requires high-capacity storage solutions that can handle both structured and unstructured data. Solid-state drives (SSDs) and the non-volatile memory express (NVMe) interface enable faster data access and processing.
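
    As a rough illustration of why storage bandwidth matters, the sketch below (the file name is a hypothetical placeholder) times how quickly a large dataset file streams from local storage, the kind of sequential read that NVMe SSDs accelerate in AI data pipelines:

```python
# Illustrative sketch only; "training_shard_000.bin" is a placeholder path.
import time

CHUNK = 8 * 1024 * 1024  # stream the file in 8 MiB chunks

def read_throughput_gbps(path: str) -> float:
    """Return sequential read throughput in gigabits per second."""
    total_bytes = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            total_bytes += len(chunk)
    elapsed = time.perf_counter() - start
    return (total_bytes * 8) / (elapsed * 1e9)

if __name__ == "__main__":
    print(f"{read_throughput_gbps('training_shard_000.bin'):.2f} Gb/s")
```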

    Wired connectivity as the binding thread

    Computing speed, loosely measured as a function of Moore’s Law, does not scale with the burgeoning data needs of AI. This means the interconnects between computing units processing in parallel become critical. Wired connectivity is also the linchpin between the computing devices: the GPUs, storage and memory. The rise of AI means that the network is the computer, and connectivity is its lifeline.
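
    A small sketch of why the interconnect matters, assuming PyTorch's torch.distributed package launched with torchrun (tooling not named in the post): the all_reduce call below is where parallel workers exchange gradients, and its speed is set by the network between them rather than by the processors themselves.

```python
# Launch with e.g.: torchrun --nproc_per_node=2 allreduce_sketch.py
import torch
import torch.distributed as dist

def main():
    dist.init_process_group(backend="gloo")  # "nccl" on GPU clusters
    rank = dist.get_rank()

    # Each worker holds a gradient-sized tensor; all_reduce sums them over the
    # network so every worker ends up with the same averaged gradients.
    grad = torch.ones(1024 * 1024) * rank
    dist.all_reduce(grad, op=dist.ReduceOp.SUM)
    grad /= dist.get_world_size()

    if rank == 0:
        print("all_reduce done; the interconnect, not the processor, set the pace")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```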

    Wireless at the edge

    It goes without saying that our digital experiences today are wireless. To this end, it is important that our wireless broadband networks can handle high-speed, low-latency data at scale. Cellular technologies and Wi-Fi work in a complementary manner to meet this demand. 5G deployments continue to improve cellular network capacity and provide top speeds. Wi-Fi innovations, together with the newly available 6 GHz band, have culminated in Wi-Fi 7, which directly addresses edge latency while ensuring top speeds.
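
    For a feel of what low latency at the edge means in practice, here is a simple, illustrative sketch (host and port are placeholders) that estimates application-level round-trip time over the local link by timing a TCP handshake:

```python
# Illustrative only; 192.168.1.1:80 is a placeholder for a device on your LAN.
import socket
import time

def tcp_rtt_ms(host: str, port: int = 80, samples: int = 5) -> float:
    """Best-of-N round-trip time, in milliseconds, of a TCP handshake."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2.0):
            pass  # only the connection setup is timed, no data transfer
        times.append((time.perf_counter() - start) * 1000.0)
    return min(times)

if __name__ == "__main__":
    print(f"best round-trip time: {tcp_rtt_ms('192.168.1.1'):.1f} ms")
```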

    Cybersecurity

    With the increasing use of AI, cybersecurity becomes ever more important for protecting against cyber threats and attacks. This requires specialized hardware and software, such as intrusion detection systems and encryption technologies.
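
    As a small software-side illustration of the encryption technologies mentioned above, the sketch below (assuming the third-party cryptography package, which the post does not reference) encrypts and decrypts a payload with a symmetric key:

```python
# Minimal sketch using the "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice the key lives in a secure key store
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"model weights or inference results")
plaintext = cipher.decrypt(ciphertext)

assert plaintext == b"model weights or inference results"
print("round-trip encryption OK")
```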

    At Broadcom, we’ve always sought to connect everything. That’s our mission: a world connected by Broadcom. We are very proud that 99% of the world’s data passes through at least one Broadcom chip. For us, AI’s hardware needs are a continuum of what we do every day. Our wired connectivity powers hyperscalers’ data centers and operator networks. At the edge, our wireless broadband solutions power your homes and bring data to your hands. Our software solutions provide the required layer of security. We love that AI is disrupting the status quo, and we are proud to provide the critical hardware that enables it.

    Before I close, let me also briefly preview some of the innovations we focus on.

    Advanced manufacturing processes: These are vital for the production of AI hardware. The use of 7-nanometer and 5-nanometer process nodes creates smaller and more powerful chips. Our chips carry higher data loads and deliver the low latencies needed for AI.

    Custom designs: We build data center storage and connectivity solutions that are optimized for specific AI workloads.

    Power efficiency: Complex workloads require large amounts of power, which can lead to increased energy costs, among other things. This is an area of focus for both our wired and wireless chips. For example, on our Wi-Fi chips used in phones, we work steadfastly on radio optimizations and architectural modifications each generation, with an eye to disruptively lowering power consumption.

    These are just a few examples of our innovation focus. In a series of follow-up blogs, we hope to delve further into what Broadcom has to offer as AI takes off.

    Vijay Nagarajan | Broadcom
