Mizzle Secures $1M Investment from Onextel to Accelerate Decentralized Infrastructure Development

Monday, July 15, 2024 10:53 AM

Mizzle, a decentralized physical infrastructure network (DePIN) platform, has received a $1 million investment from tech investor Onextel. The funding will be used to enhance the platform's compute and storage capabilities, security features, and user experience. Mizzle's technology combines AI-based DevOps, eBPF security, and advanced encryption to offer a secure and efficient decentralized infrastructure solution, and the investment is expected to help the company capture a significant share of the growing DePIN market.

Related News

2 days ago
The Decentralization of AI Computing: A New Era of Demand and Efficiency
The AI industry is currently experiencing a pivotal moment characterized by the emergence of smaller and more efficient models, such as DeepSeek. Contrary to expectations, these advancements do not diminish the demand for computing resources; instead, they amplify it, aligning with Jevons’ Paradox, which suggests that increased efficiency can lead to greater overall consumption. As AI models become cheaper, faster, and more accessible, the demand for computing power continues to rise, raising critical questions about how to support widespread AI inference without creating new bottlenecks in the existing infrastructure.

Historically, AI has depended on large-scale centralized infrastructure controlled by hyperscalers, which has led to concerns about accessibility, pricing, and availability. However, the introduction of models like DeepSeek challenges this paradigm by demonstrating that efficiency gains can create new pressures on computing resources. As more individuals and organizations adopt AI technologies, the total compute demand is skyrocketing, particularly as open-source alternatives gain traction. This shift is evident in the rapid development of free and open-source models that outperform proprietary options, allowing startups and independent developers to participate in the AI landscape without the constraints imposed by traditional cloud providers.

As the demand for scalable and cost-effective AI infrastructure increases, decentralized computing is emerging as a viable solution. By distributing workloads across a global network of high-performance GPUs, this model addresses many inefficiencies associated with centralized systems. Decentralization not only enhances cost efficiency and scalability but also provides greater privacy and control over data. The success of models like DeepSeek illustrates the need for a shift toward distributed AI computing, where developers and researchers can operate independently of monopolized cloud infrastructure. The future of AI computing is not about reducing demand but about adapting to an ever-growing need for computational power, ensuring that the AI ecosystem evolves in tandem with its advancements.
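As a back-of-the-envelope illustration of the Jevons’ Paradox dynamic described above, the sketch below shows how total compute demand can rise even as per-inference cost falls. All figures are purely hypothetical and chosen only to make the arithmetic concrete.

```python
# Hypothetical illustration of Jevons' Paradox for AI compute.
# Every number here is an assumption for the sake of the example.

cost_per_inference_old = 1.0    # arbitrary compute units per request
cost_per_inference_new = 0.1    # a 10x efficiency gain (e.g. a smaller model)

requests_old = 1_000_000        # monthly inference requests before the gain
requests_new = 30_000_000       # cheaper inference unlocks 30x more usage

total_old = cost_per_inference_old * requests_old
total_new = cost_per_inference_new * requests_new

print(f"Old total compute: {total_old:,.0f} units")
print(f"New total compute: {total_new:,.0f} units")
# Despite each request being 10x cheaper, total demand triples.
```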
2 days ago
Mawari Launches Digital Entertainment City Namba: A Smart City Revolution
In a groundbreaking collaboration, Mawari has joined forces with Nankai Electric Railway Co., Ltd., Meta Osaka Co., Ltd., and e-stadium Co., Ltd. to launch the "Digital Entertainment City Namba" in Osaka, Japan. This innovative project aims to create the world’s first smart city that integrates artificial intelligence (AI), extended reality (XR), and decentralized physical infrastructure networks (DePIN) on a city-wide scale. By leveraging the unique strengths of each partner, the initiative seeks to blend advanced technology with everyday urban life, fostering a vibrant digital culture and addressing social challenges through community engagement.

Mawari's pivotal role involves deploying edge computing and rendering devices across Nankai’s properties to establish a decentralized streaming infrastructure. This setup enhances user experience by minimizing latency and enabling real-time interactions with lifelike AI avatars. These avatars are designed to assist in various tasks, such as guiding tourists and facilitating communication across language barriers. The project is a significant milestone for Mawari, as it aims to democratize AI-driven immersive experiences, with CEO Luis Oscar Ramirez emphasizing its potential for mass adoption and tangible social impact.

The projected impact of the Digital Entertainment City Namba extends beyond entertainment, targeting sectors such as tourism and labor. By providing multilingual 3D guides and immersive cultural experiences, the project aims to attract more foreign visitors to Japan, thereby boosting local businesses. Additionally, it addresses Japan's labor shortage by creating flexible, remote work opportunities through AI-driven avatars, promoting inclusivity for diverse groups. This initiative not only enhances accessibility but also aligns with Japan’s pressing need for innovative workforce solutions, marking a transformative step towards a digitally integrated urban future.
2 days ago
Acurast Integrates Monad Testnet to Enhance Blockchain Capabilities
Acurast has made significant strides by successfully integrating Monad’s testnet into its network, which enhances the capabilities of real-time, high-performance blockchain technology. This integration allows Monad to be seamlessly accessible through Acurast, thereby streamlining the onboarding process for projects within the Monad ecosystem. The collaboration enables projects to utilize Acurast’s decentralized compute network, which offers efficient and cost-effective off-chain computation, ultimately strengthening the Monad ecosystem and providing builders with a more accessible way to leverage powerful processing capabilities.

Monad is distinguished for its cutting-edge blockchain infrastructure that emphasizes real-time execution and scalability. The integration with Acurast enhances Monad’s core advantages, such as optimized execution layers that significantly improve transaction throughput and efficiency. Acurast complements this by providing a decentralized compute environment that ensures data integrity and security while maintaining real-time responsiveness. This synergy allows developers to innovate without sacrificing speed or security, paving the way for advanced applications like high-frequency trading algorithms and AI-driven solutions.

The importance of this integration extends beyond mere technical enhancements; it signifies a pivotal moment for Decentralized Physical Infrastructure Networks (DePIN). While sectors like Gaming and DeFi have gained traction, DePIN represents the next frontier in decentralized technology. By merging Acurast’s compute network with Monad, the partnership is not only advancing computing capabilities but also making decentralized infrastructure more accessible and robust. With over 37,000 devices onboarded and around 170 million transactions on the testnet, Acurast is poised to meet the growing demand for real-world applications in the crypto economy, shaping the future landscape of DePIN.
2 days ago
Digital Entertainment City Namba: A Fusion of AI and XR in Osaka
Digital Entertainment City Namba is an innovative extended reality (XR) project located in Osaka, Japan, which integrates artificial intelligence (AI) guides throughout the city. This initiative is powered by Mawari's decentralized physical infrastructure network (DePIN), showcasing how DePINs can effectively meet the computational demands of both XR and AI technologies. The collaboration involves key players such as Mawari, Meta Osaka, Nankai Electric Railway, and the Namba e-stadium, highlighting Osaka's rich cultural and technological heritage while pushing the boundaries of immersive experiences.

The project aims to enhance tourist experiences by utilizing virtual AI guides capable of performing various tasks, from providing guidance to offering customer service. These AI-driven characters are designed to facilitate intuitive interactions that transcend language barriers, making tourism more accessible. The integration of AI within XR experiences presents unique challenges, particularly due to the significant computational power required for graphics rendering and AI processing. DePINs, like those offered by Mawari, promise to alleviate these challenges by leveraging decentralized GPU networks to reduce latency and bandwidth demands.

As the demand for AI processing grows, the transition from graphical rendering to AI capabilities is becoming increasingly common among GPU DePINs. Notably, Render Network has successfully pivoted to include AI processing alongside its original focus on graphics. The Digital Entertainment City Namba project exemplifies this trend, illustrating the potential for decentralized GPU networks to support the convergence of XR and AI technologies. Furthermore, under Japanese law, DePIN tokens are classified as utility tokens, which helps to navigate regulatory challenges while fostering innovation in the sector.
2 days ago
Unlock unparalleled computing power with GPUs on Aleph Cloud
GPUs are now integrated into [Aleph Cloud](https://aleph.im/), providing high-performance computing for AI services, rendering, gaming, AI agents, and more.

We are proud to announce a significant milestone for our decentralized cloud ecosystem with the latest integration of GPUs. The new Aleph Cloud Core Channel Node application introduces GPU integration, a transformative feature that positions Aleph Cloud as a more powerful and competitive solution for high-performance computing needs. In addition, this release brings account management capabilities, pricing estimations, and a host of performance improvements, making Aleph Cloud an even more robust platform for developers and enterprises alike.

With GPU support, users can now unlock unparalleled computing power for tasks such as AI training, rendering, gaming, and running AI agents. By leveraging decentralized infrastructure, Aleph Cloud offers both scalability and flexibility at highly competitive prices, all through the convenience of a Pay-as-You-Go (PAYG) model using ALEPH tokens.

# Revolutionizing cloud computing with GPU integration

The integration of GPUs into Aleph Cloud marks a pivotal step forward. By enabling access to both consumer-grade and enterprise-grade GPUs, users can now select the exact resources they need through our intuitive interface at console.twentysix.cloud. This empowers them to scale their workloads seamlessly, whether they are building machine learning models, rendering complex graphics, or powering cloud-based gaming platforms.

Each GPU model is paired with specific vCPU, RAM, and disk allocations, ensuring peak performance for any workload. For example, an RTX 4090, ideal for gaming or smaller AI projects, will cost around $240 per month*, while an enterprise-grade H100, designed for data-intensive operations, costs approximately $1,920 per month*.

# Enhanced Pay-As-You-Go model

Aleph Cloud’s PAYG model is designed to offer flexibility and cost transparency, a critical advantage for businesses and developers. With ALEPH tokens as the exclusive payment method, users can enjoy a seamless payment experience across blockchains like Base and Avalanche. This ensures not only affordability but also accessibility for a global audience.

Additionally, when a user provisions a GPU-powered virtual machine and initiates a payment, the system automatically creates two payment streams: 80% of the payment is allocated directly to the node operator providing the compute power, while 20% contributes to the Aleph Cloud treasury. This model ensures a fair and sustainable distribution of resources, incentivizing node operators while also reinvesting in the continued growth and development of the Aleph Cloud ecosystem.

The pricing structure is straightforward yet highly competitive. For instance:

* Consumer GPUs like the RTX 4090 are priced at ~$111 per month*, while an RTX 4000 ADA costs only ~$55 per month*.
* Enterprise GPUs such as the A100 and H100 are geared towards intensive tasks, priced at ~$875 and ~$583 per month*, respectively.

This transparent structure enables users to budget accurately, scaling their resources as their projects evolve.

![Aleph pricing model](https://pbs.twimg.com/media/GkKhUsAXwAE7jv9?format=png&name=large)

Pricing breakdown as of 18/02/2025
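For a rough sense of how the 80/20 stream split described above plays out, here is a minimal sketch. The monthly price, the ALEPH/USD rate, the 30-day month, and the function name are illustrative assumptions, not Aleph Cloud’s actual API or live pricing.

```python
# Minimal sketch of the PAYG stream split described above.
# The ALEPH/USD rate and the monthly price are assumptions,
# not live Aleph Cloud pricing.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60
NODE_OPERATOR_SHARE = 0.80   # share streamed to the compute provider
TREASURY_SHARE = 0.20        # share streamed to the Aleph Cloud treasury


def payg_streams(monthly_price_usd: float, aleph_usd_rate: float) -> dict:
    """Split a monthly GPU price into two per-second ALEPH payment streams."""
    monthly_price_aleph = monthly_price_usd / aleph_usd_rate
    per_second = monthly_price_aleph / SECONDS_PER_MONTH
    return {
        "node_operator_aleph_per_s": per_second * NODE_OPERATOR_SHARE,
        "treasury_aleph_per_s": per_second * TREASURY_SHARE,
    }


if __name__ == "__main__":
    # Example: an RTX 4090 instance at ~$111/month, with ALEPH assumed
    # at a hypothetical $0.10 per token.
    print(payg_streams(111.0, 0.10))
```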
# New features for an optimized user experience

In addition to GPU integration, this release introduces several features aimed at enhancing performance and usability. Among the key highlights:

* Aggregates caching: By implementing caching for owner aggregates and metrics data, the platform now offers faster response times and smoother performance.
* Pricing estimation: Users can now estimate the cost of executable messages before sending them, providing greater transparency and control.
* Pre-calculated balances: This feature significantly improves system performance by reducing computational overhead.
* Cost service updates: The cost service now uses pricing aggregates for more accurate estimations.

These enhancements collectively make Aleph Cloud more reliable and user-friendly, reinforcing its position as a leading solution for decentralized cloud computing.

## Unlocking new use cases with GPUs

The addition of GPUs extends Aleph Cloud’s capabilities to serve a diverse range of industries and applications:

* AI and machine learning: Train and deploy machine learning models faster and more efficiently.
* Rendering: Create stunning visual effects and high-quality 3D graphics with reduced rendering times.
* Gaming: Deliver seamless, high-performance experiences for cloud gaming platforms.
* AI agents: Power advanced AI agents with robust, decentralized computing resources.

These use cases showcase the versatility of Aleph Cloud’s GPU offering, making it a compelling choice for developers, creators, and businesses alike.

# Looking Ahead

The release of GPU integration and the accompanying features represents a significant leap forward for Aleph Cloud and its community. As we continue to enhance our platform, we remain committed to empowering users with the tools they need to innovate, scale, and succeed in a decentralized world.

To explore these new features and start leveraging GPUs on Aleph Cloud, visit [console.twentysix.cloud](https://console.twentysix.cloud/computing/gpu-instance/) today. For more updates, follow us on Twitter and join our Discord community. Together, let’s redefine the future of cloud computing.

*Prices are converted to USD from the current ALEPH token price and may change over time.*
3 days ago
Zeeve Launches Cogitus: A Game-Changer for Avalanche L1 Testnet Deployment
On February 20, 2025, Zeeve, a prominent provider of blockchain infrastructure solutions, unveiled Cogitus, a one-click deployment platform for Avalanche Layer 1 (L1) testnets. This innovative platform allows Web3 startups and developers to launch fully functional public testnets for just $50 for the first six months. Traditionally, deploying an independent L1 network has required significant resources and extensive DevOps expertise, but Cogitus aims to simplify this process, making it more accessible for developers across various industries, including gaming and finance.

Avalanche L1s have gained popularity due to their customizable and sovereign blockchain environments. They offer independent governance, validator structures, and economic models while ensuring interoperability within the broader Avalanche ecosystem. With the launch of Cogitus, developers can now deploy and experiment with their own testnets, adjusting configurations in real time to fit their specific use cases. This flexibility positions Avalanche L1s as a preferred solution for high-performance decentralized applications (dApps) and Web3 startups looking for scalable blockchain solutions.

Key features of Cogitus include fully loaded public testnets with preconfigured validator nodes, RPC endpoints, and dedicated explorers, all ready for deployment. The platform automates network configuration, eliminating the technical complexities that often accompany such setups. Additionally, it offers 24/7 proactive monitoring and multi-cloud redundancy to ensure high availability. With a discounted mainnet deployment available for $995, Cogitus by Zeeve is set to revolutionize the way developers launch and scale Avalanche L1s, enabling rapid innovation in sectors like gaming, DeFi, and AI-driven networks.