Network3 Announces New Local Large Language Model (LLM) Feature

Thursday, October 3, 2024 3:44 PM

Network3, an AI Layer2 platform for global artificial intelligence developers, recently unveiled its latest innovation at the R3al World Summit in Singapore. The new Local Large Language Model (LLM) feature is designed to enhance the efficiency and performance of Edge AI technology. Edge AI involves deploying AI models directly on local devices like smartphones, bringing data processing closer to the source to improve application performance and reduce latency. The global Edge AI market is projected to exceed $269 billion in the next eight years, highlighting the growing significance of this technology.

Transforming Smart Devices into AI Training Assets

With a focus on making Web3 and AI technologies accessible, Network3 integrates DePIN with AI so that IoT devices can train small AI models. By contributing idle resources on their smart devices, individuals can participate in AI training and earn rewards. The local LLM feature aims to put smart devices' spare processing capacity to work during idle periods, reducing reliance on cloud computing, cutting bandwidth usage, and enhancing data security and privacy. Network3, which operates over 320,000 active nodes globally, recently launched the N3 Edge V1 mining device, offering dual mining of IoTeX and Network3 tokens.
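Network3 has not published its scheduling logic, so as a purely illustrative sketch, the kind of idle-time gating described above might look like the following. All field names and thresholds here are hypothetical, not Network3's actual policy:

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    battery_pct: int   # remaining battery, 0-100
    charging: bool     # plugged into power
    screen_on: bool    # user is actively using the device
    on_wifi: bool      # unmetered connection available

def can_run_local_llm(state: DeviceState) -> bool:
    """Return True if the device is idle enough to run local AI work.

    Hypothetical policy: only work while the screen is off, only sync
    over an unmetered (Wi-Fi) connection, and only when charging or
    when the battery is comfortably above half.
    """
    if state.screen_on:
        return False            # never compete with the user
    if not state.on_wifi:
        return False            # avoid metered bandwidth
    return state.charging or state.battery_pct > 50
```

A real client would additionally need OS-level hooks (battery and screen-state callbacks) and would re-evaluate this check continuously rather than once.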

Offering AI Chat Services on Mobile Devices

Network3’s latest update allows users to access AI chat services on their mobile devices without the need for expensive cloud infrastructure. A test version of the update will be available for download on the official website soon, providing users with the opportunity to interact with the model, earn tokens, and personalize their AI experience.


Related News

7 days ago
New Jersey Devils Launch AI Chatbot 'Bott Stevens' for Enhanced Fan Engagement
The New Jersey Devils have introduced a groundbreaking AI chatbot named "Bott Stevens," aimed at enhancing digital fan engagement. This innovative chatbot is named after the legendary Devils player Scott Stevens and is powered by Theta EdgeCloud's decentralized AI infrastructure. Scheduled to launch during the 2024-25 NHL season, Bott Stevens will be accessible on the team's official website, providing fans with real-time information on game schedules, ticket sales, statistics, and merchandise. Utilizing Theta's Retrieval Augmented Generation technology, the chatbot will ensure data accuracy by sourcing information from official NHL channels, thereby minimizing the risk of misinformation from unverified sources.

Bott Stevens boasts impressive computational capabilities, leveraging Theta EdgeCloud's network of over 30,000 edge nodes and distributed GPUs, which collectively offer more than 80 PetaFLOPS of processing power. This robust infrastructure is designed to handle peak demand, particularly during high-stakes events like playoffs or significant team announcements. In addition to answering fan inquiries, the chatbot will provide historical highlights, game recaps, venue information, and updates on team events. Future enhancements may include predictive analytics for fantasy sports and interactive tools to further engage fans.

To promote Bott Stevens, the Devils plan to integrate its capabilities across multiple platforms, encouraging fan interaction and awareness. The chatbot will deliver not only statistics and schedules but also curated content that enriches the fan experience, such as historical highlights and information about upcoming events. Success metrics will include user engagement rates, accuracy of information, and feedback from fans, ensuring that Bott Stevens remains a valuable resource for the Devils' community. By focusing on continuous learning and personalized interactions, the team aims to differentiate Bott Stevens from other AI agents, creating a unique and engaging experience for fans.
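The Retrieval Augmented Generation approach described above grounds the model's answers in trusted documents. A minimal sketch of the idea, using a toy keyword-overlap retriever (the documents and scoring are hypothetical stand-ins; a production system like Theta's would use vector embeddings over official NHL sources):

```python
import re

# Hypothetical store of trusted documents the answer must be grounded in.
OFFICIAL_DOCS = [
    "The New Jersey Devils open the 2024-25 season at home in October.",
    "Single-game tickets are available on the team's official website.",
    "Scott Stevens captained the Devils to three Stanley Cup titles.",
]

def _words(text: str) -> set[str]:
    """Lowercase word set, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def retrieve(question: str, docs: list[str]) -> str:
    """Return the document sharing the most words with the question."""
    q = _words(question)
    return max(docs, key=lambda d: len(q & _words(d)))

def build_prompt(question: str) -> str:
    """Attach the retrieved document as grounding context for the LLM."""
    context = retrieve(question, OFFICIAL_DOCS)
    return f"Answer using only this source:\n{context}\n\nQuestion: {question}"
```

The point of the pattern is that the generator is instructed to answer from the retrieved source rather than from its open-ended training data, which is what limits misinformation.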
8 days ago
Revolutionizing Confidential AI with Intel TDX and iExec
The landscape of Confidential AI is undergoing a significant transformation, primarily driven by the introduction of Intel® Trust Domain Extensions (Intel® TDX). For years, developers faced challenges when trying to secure AI workloads using Intel SGX, which often required extensive modifications to applications and led to compatibility issues. This cumbersome process not only wasted time but also hindered the real-world adoption of Confidential AI solutions. However, with Intel TDX, developers can now run AI workloads in secure virtual machines without needing to rewrite their code, thereby streamlining the development process and enhancing performance.

Intel TDX is designed to create a hardware-isolated trusted execution environment (TEE) that enhances data confidentiality and integrity in virtualized environments. Built into Intel's 4th Generation Xeon® Scalable processors, TDX introduces Trust Domains that isolate virtual machines from the hypervisor and even cloud service providers. This isolation is crucial for AI applications that handle sensitive datasets and proprietary models, as it significantly reduces the attack surface while maintaining high performance. Additionally, TDX is optimized for AI workloads, leveraging advanced CPU capabilities to accelerate deep learning and machine learning models, making it a robust choice for developers.

The collaboration between Intel TDX and iExec is paving the way for a new era of Confidential AI. As a Gold Member of the Intel Partner Alliance, iExec is at the forefront of this movement, providing solutions that enable secure, decentralized, and scalable execution of AI workloads. This partnership not only enhances the security of AI computations but also ensures compliance with data protection regulations. With practical applications in sectors like healthcare and finance, iExec empowers developers to build privacy-preserving AI applications that prioritize data ownership and secure computing, ultimately leading to a more trustworthy AI ecosystem.
9 days ago
AI Cryptocurrencies Bittensor and IntelMarkets Show Promising Growth Potential
AI-driven cryptocurrencies like Bittensor and IntelMarkets are experiencing a significant surge, with last month's impressive 40% increase drawing attention from analysts. The growing adoption of artificial intelligence and decentralized intelligence has positioned both TAO and INTL tokens for potential substantial growth. Investors are left to ponder whether this is merely the onset of a larger movement in the crypto space, particularly as Bittensor's recent developments suggest a promising future.

Bittensor's TAO token has recently gained traction in the DeFi AI sector, although it remains within a descending price channel. A minor bullish engulfing pattern has emerged, hinting at a possible breakout. Factors contributing to this optimism include President Trump's substantial investment plan for AI, which, despite focusing on centralized solutions, may elevate discussions around decentralized AI, where Bittensor is making strides. Additionally, a partnership between Zuvu AI and Vana aims to enhance decentralized AI within Bittensor, potentially paving the way for TAO to reach the ambitious $1,000 mark under favorable market conditions.

On the other hand, IntelMarkets is democratizing access to AI tools for everyday traders, previously available only to hedge funds. By providing advanced trading bots and real-time alerts, IntelMarkets empowers small traders to make informed decisions. The platform also emphasizes education, offering resources to simplify complex investment strategies. With its robust security system, Codeum, IntelMarkets ensures user assets remain protected. Currently in Stage 10 of its presale, the INTL token is priced attractively at $0.092, with predictions suggesting it could rival Bittensor's market cap, offering early investors a potential 20,000% return on investment.
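The quoted return figure implies a concrete target price. As a quick check of the arithmetic (the 20,000% projection is the source's own claim, not a verified forecast):

```python
def implied_price(entry_price: float, return_pct: float) -> float:
    """Price a token must reach to deliver a given percentage return."""
    return entry_price * (1 + return_pct / 100)

# A 20,000% return on the $0.092 presale price means a 201x multiple.
target = implied_price(0.092, 20_000)
print(round(target, 2))  # 18.49
```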
9 days ago
iExec's Decentralized Confidential Computing: A Solution for Web3 Privacy Challenges
The Web3 revolution has brought forth promises of enhanced ownership, transparency, and security for users. However, a significant challenge remains: the issue of data privacy and security. Blockchain technology, while offering pseudonymity, records every transaction on a public ledger, which means that true privacy is elusive. Through extensive on-chain analysis, individuals can be traced, exposing them to potential tracking and security threats. To genuinely realize the vision of Web3, developers must prioritize the integration of privacy-centric solutions that safeguard user data while upholding transparency and security.

One of the critical vulnerabilities in Web3 is the handling of data in use. Traditional security measures often focus on data at rest or in transit, neglecting the sensitive data actively processed by applications. Without proper protection, this data remains unencrypted in memory, making it susceptible to breaches and unauthorized access. iExec is addressing this issue with its innovative Decentralized Confidential Computing (DeCC) approach, which combines the decentralization of blockchain with hardware-based security to protect data in use, thus enabling users to maintain ownership and monetize their information securely.

iExec's protocol integrates off-chain confidential computing with on-chain blockchain security through its Proof of Contribution (POCO) smart contracts. This synergy allows developers to create trust-driven decentralized applications (dApps) that manage sensitive data without exposing it to third parties. Tools such as DataProtector and Web3Mail empower developers to encrypt data, manage access dynamically, and monetize digital assets while ensuring privacy. As AI development continues to face challenges regarding data privacy and fair compensation for contributors, iExec's Confidential AI solutions promise secure and scalable workflows, ensuring that data contributors retain control and value in the evolving landscape of Web3.
11 days ago
StrikeBit Partners with Aethir to Enhance AI Development
StrikeBit has recently announced a strategic partnership with Aethir, aimed at providing AI developers with the necessary computing resources to develop and scale AI agents efficiently. This collaboration integrates StrikeBit's advanced technology with Aethir's decentralized GPU computing infrastructure, ensuring secure and effective AI development and deployment. StrikeBit, a cryptocurrency trading platform, utilizes AI agents to analyze market trends and implement strategies, thus creating new investment opportunities for crypto users.

Aethir stands out as a decentralized GPU computing platform that offers high-performance cloud infrastructure, enabling businesses to access powerful GPUs for various applications, including AI models and gaming. The partnership is particularly significant as it allows StrikeBit's AI developers to leverage Aethir's efficient GPU technology, which is crucial for processing vast amounts of data and refining AI products. This integration not only enhances the capabilities of StrikeBit's AI agents but also ensures that the development process remains cost-effective and accessible for smaller teams.

The implications of this partnership are profound. By utilizing Aethir's decentralized computing power, StrikeBit is fostering an ecosystem that prioritizes privacy, accessibility, and ownership while delivering high-performing computing resources. This collaboration eliminates the reliance on traditional centralized cloud services, which can be prohibitively expensive. As a result, StrikeBit's AI developers can expedite their workflows, expand their operations, and explore new opportunities without the usual constraints, ultimately transforming the landscape of AI development in the cryptocurrency space.
13 days ago
io.net and Aethir Collaborate to Enhance Decentralized GPU Computing
In a significant move for the decentralized computing landscape, io.net and Aethir have announced a strategic collaboration aimed at enhancing GPU access and performance for applications in AI, machine learning, and gaming. By integrating io.net's advanced virtualization technology with Aethir's enterprise-grade distributed GPU cloud, the partnership seeks to create a robust, low-latency, and cost-effective solution tailored for GPU-intensive workloads. This collaboration comes at a time when the demand for GPU computing is surging, with projections indicating that the market could quadruple in size by 2030.

The alliance between io.net and Aethir is designed to provide a highly scalable and efficient solution to meet the growing global demand for GPU resources. io.net's cutting-edge virtualization and orchestration capabilities will enable AI and machine learning engineers to deploy Ray and Kubernetes clusters seamlessly across a network of over 600,000 decentralized GPUs and CPUs. Meanwhile, Aethir's distributed cloud infrastructure is set to provide enterprise clients in the AI, machine learning, and gaming sectors with fast and scalable GPU cloud resources, leveraging a network of over 40,000 high-performance GPUs, including 3,000 NVIDIA H100s.

Under the terms of this collaboration, both companies will integrate their ecosystems to offer customers a seamless GPU computing experience across various workloads, including clustering and serverless inferencing. This reciprocal integration will allow io.net's clustering solutions to be accessible on Aethir's platform, providing enterprises with a diverse range of GPU-based computing options. Additionally, both companies plan to collaborate on marketing and community initiatives, further enhancing the overall ecosystem. As part of their partnership, an airdrop will distribute $50 million worth of tokens to community members of both platforms, marking a significant milestone in their joint mission to democratize high-performance compute access for all.