Latest Storage and Data Protection News Highlights

Friday, August 23, 2024 3:33 PM

* Assured Data Protection expands into Latin America, launching operations and offering Rubrik technology to ensure cyber resiliency for enterprises.
* CUDOS and Storj collaborate to provide distributed compute and storage solutions for AI and modeling workloads.
* DDN achieves tier-one performance data platform certification for NVIDIA Partner Network Cloud Partners.
* LiquidStack introduces the CDU-1MW, a universally compatible coolant distribution unit for direct-to-chip liquid cooling.
* Rubrik introduces data protection for Salesforce, emphasizing the importance of backing up SaaS applications.
* The RocketStor 6541x Series NVMe RAID enclosures offer high performance and flexibility for PCIe Gen4 platforms.
* Corporate IT teams express concerns over data protection practices in the enterprise.


Related News

2 days ago
The Decentralization of AI Computing: A New Era of Demand and Efficiency
The AI industry is currently experiencing a pivotal moment characterized by the emergence of smaller and more efficient models, such as DeepSeek. Contrary to expectations, these advancements do not diminish the demand for computing resources; instead, they amplify it, aligning with Jevons’ Paradox, which suggests that increased efficiency can lead to greater overall consumption. As AI models become cheaper, faster, and more accessible, the demand for computing power continues to rise, raising critical questions about how to support widespread AI inference without creating new bottlenecks in the existing infrastructure. Historically, AI has depended on large-scale centralized infrastructure controlled by hyperscalers, which has led to concerns about accessibility, pricing, and availability. However, the introduction of models like DeepSeek challenges this paradigm by demonstrating that efficiency gains can create new pressures on computing resources. As more individuals and organizations adopt AI technologies, the total compute demand is skyrocketing, particularly as open-source alternatives gain traction. This shift is evident in the rapid development of free and open-source models that outperform proprietary options, allowing startups and independent developers to participate in the AI landscape without the constraints imposed by traditional cloud providers. As the demand for scalable and cost-effective AI infrastructure increases, decentralized computing is emerging as a viable solution. By distributing workloads across a global network of high-performance GPUs, this model addresses many inefficiencies associated with centralized systems. Decentralization not only enhances cost efficiency and scalability but also provides greater privacy and control over data. The success of models like DeepSeek illustrates the need for a shift toward distributed AI computing, where developers and researchers can operate independently of monopolized cloud infrastructure. The future of AI computing is not about reducing demand but adapting to an ever-growing need for computational power, ensuring that the AI ecosystem evolves in tandem with its advancements.
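To make the Jevons' Paradox argument above concrete, here is a minimal illustrative sketch. All numbers are hypothetical and chosen only to show how a large drop in per-inference cost can coincide with a rise in total compute demand; nothing here is drawn from DeepSeek's actual usage figures.

```typescript
// Illustrative only: hypothetical numbers showing how per-inference efficiency
// gains can still increase total compute demand (Jevons' Paradox).
function totalComputeDemand(
  inferencesPerDay: number,       // how many inferences users run per day
  gpuSecondsPerInference: number, // compute cost of a single inference
): number {
  return inferencesPerDay * gpuSecondsPerInference; // GPU-seconds per day
}

// Before: an expensive model, used sparingly.
const before = totalComputeDemand(1_000_000, 2.0); // 2,000,000 GPU-s/day

// After: a model 10x cheaper per inference, but adoption grows 25x
// because it becomes viable for far more applications.
const after = totalComputeDemand(25_000_000, 0.2); // 5,000,000 GPU-s/day

console.log(after > before); // true: greater efficiency, yet higher total demand
```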
2 days ago
Mawari Launches Digital Entertainment City Namba: A Smart City Revolution
In a groundbreaking collaboration, Mawari has joined forces with Nankai Electric Railway Co., Ltd., Meta Osaka Co., Ltd., and e-stadium Co., Ltd. to launch the "Digital Entertainment City Namba" in Osaka, Japan. This innovative project aims to create the world’s first smart city that integrates artificial intelligence (AI), extended reality (XR), and decentralized physical infrastructure networks (DePIN) on a city-wide scale. By leveraging the unique strengths of each partner, the initiative seeks to blend advanced technology with everyday urban life, fostering a vibrant digital culture and addressing social challenges through community engagement. Mawari's pivotal role involves deploying edge computing and rendering devices across Nankai’s properties to establish a decentralized streaming infrastructure. This setup enhances user experience by minimizing latency and enabling real-time interactions with lifelike AI avatars. These avatars are designed to assist in various tasks, such as guiding tourists and facilitating communication across language barriers. The project is a significant milestone for Mawari, as it aims to democratize AI-driven immersive experiences, with CEO Luis Oscar Ramirez emphasizing its potential for mass adoption and tangible social impact. The projected impact of the Digital Entertainment City Namba extends beyond entertainment, targeting sectors such as tourism and labor. By providing multilingual 3D guides and immersive cultural experiences, the project aims to attract more foreign visitors to Japan, thereby boosting local businesses. Additionally, it addresses Japan's labor shortage by creating flexible, remote work opportunities through AI-driven avatars, promoting inclusivity for diverse groups. This initiative not only enhances accessibility but also aligns with Japan’s pressing need for innovative workforce solutions, marking a transformative step towards a digitally integrated urban future.
2 days ago
Acurast Integrates Monad Testnet to Enhance Blockchain Capabilities
Acurast has made significant strides by successfully integrating Monad’s testnet into its network, which enhances the capabilities of real-time, high-performance blockchain technology. This integration allows Monad to be seamlessly accessible through Acurast, thereby streamlining the onboarding process for projects within the Monad ecosystem. The collaboration enables projects to utilize Acurast’s decentralized compute network, which offers efficient and cost-effective off-chain computation, ultimately strengthening the Monad ecosystem and providing builders with a more accessible way to leverage powerful processing capabilities. Monad is distinguished for its cutting-edge blockchain infrastructure that emphasizes real-time execution and scalability. The integration with Acurast enhances Monad’s core advantages, such as optimized execution layers that significantly improve transaction throughput and efficiency. Acurast complements this by providing a decentralized compute environment that ensures data integrity and security while maintaining real-time responsiveness. This synergy allows developers to innovate without sacrificing speed or security, paving the way for advanced applications like high-frequency trading algorithms and AI-driven solutions. The importance of this integration extends beyond mere technical enhancements; it signifies a pivotal moment for Decentralized Physical Infrastructure Networks (DePIN). While sectors like Gaming and DeFi have gained traction, DePIN represents the next frontier in decentralized technology. By merging Acurast’s compute network with Monad, the partnership is not only advancing computing capabilities but also making decentralized infrastructure more accessible and robust. With over 37,000 devices onboarded and around 170 million transactions on the Testnet, Acurast is poised to meet the growing demand for real-world applications in the crypto economy, shaping the future landscape of DePIN.
2 days ago
Digital Entertainment City Namba: A Fusion of AI and XR in Osaka
Digital Entertainment City Namba is an innovative extended reality (XR) project located in Osaka, Japan, which integrates artificial intelligence (AI) guides throughout the city. This initiative is powered by Mawari's decentralized physical infrastructure network (DePIN), showcasing how DePINs can effectively meet the computational demands of both XR and AI technologies. The collaboration involves key players such as Mawari, Meta Osaka, Nankai Electric Railway, and the Namba e-stadium, highlighting Osaka's rich cultural and technological heritage while pushing the boundaries of immersive experiences. The project aims to enhance tourist experiences by utilizing virtual AI guides capable of performing various tasks, from providing guidance to offering customer service. These AI-driven characters are designed to facilitate intuitive interactions that transcend language barriers, making tourism more accessible. The integration of AI within XR experiences presents unique challenges, particularly due to the significant computational power required for graphics rendering and AI processing. DePINs, like those offered by Mawari, promise to alleviate these challenges by leveraging decentralized GPU networks to reduce latency and bandwidth demands. As the demand for AI processing grows, the transition from graphical rendering to AI capabilities is becoming increasingly common among GPU DePINs. Notably, Render Network has successfully pivoted to include AI processing alongside its original focus on graphics. The Digital Entertainment City Namba project exemplifies this trend, illustrating the potential for decentralized GPU networks to support the convergence of XR and AI technologies. Furthermore, under Japanese law, DePIN tokens are classified as utility tokens, which helps to navigate regulatory challenges while fostering innovation in the sector.
2 days ago
Unlock unparalleled computing power with GPUs on Aleph Cloud
GPUs are now integrated into [Aleph Cloud](https://aleph.im/), providing high-performance computing for AI services, rendering, gaming, AI agents, and more. We are proud to announce a significant milestone for our decentralized cloud ecosystem with the latest integration of GPUs. The new Aleph Cloud Core Channel Node application introduces GPU integration, a transformative feature that positions Aleph Cloud as a more powerful and competitive solution for high-performance computing needs. In addition, this release brings account management capabilities, pricing estimations, and a host of performance improvements, making Aleph Cloud an even more robust platform for developers and enterprises alike. With GPU support, users can now unlock substantial computing power for tasks such as AI training, rendering, gaming, and running AI agents. By leveraging decentralized infrastructure, Aleph Cloud offers both scalability and flexibility at highly competitive prices, all through the convenience of a Pay-as-You-Go (PAYG) model using ALEPH tokens.

# Revolutionizing cloud computing with GPU integration

The integration of GPUs into Aleph Cloud marks a pivotal step forward. By enabling access to both consumer-grade and enterprise-grade GPUs, users can select the exact resources they need through our intuitive interface at console.twentysix.cloud. This empowers them to scale their workloads seamlessly, whether they are building machine learning models, rendering complex graphics, or powering cloud-based gaming platforms. Each GPU model is paired with specific vCPU, RAM, and disk allocations, ensuring peak performance for any workload. For example, an RTX 4090, ideal for gaming or smaller AI projects, will cost around $240 per month*, while an enterprise-grade H100, designed for data-intensive operations, costs approximately $1,920 per month*.

# Enhanced Pay-As-You-Go model

Aleph Cloud’s PAYG model is designed to offer flexibility and cost transparency, a critical advantage for businesses and developers. With ALEPH tokens as the exclusive payment method, users enjoy a seamless payment experience across blockchains like Base and Avalanche. This ensures not only affordability but also accessibility to a global audience.

Additionally, when a user provisions a GPU-powered virtual machine and initiates a payment, the system automatically creates two payment streams: 80% of the payment is allocated directly to the node operator providing the compute power, while 20% contributes to the Aleph Cloud treasury. This model ensures a fair and sustainable distribution of resources, incentivizing node operators while reinvesting in the continued growth and development of the Aleph Cloud ecosystem.

The pricing structure is straightforward yet highly competitive. For instance:

* Consumer GPUs like the RTX 4090 are priced at ~$111 per month, while an RTX 4000 ADA costs only ~$55 per month*.
* Enterprise GPUs such as the A100 and H100 are geared towards intensive tasks, priced at ~$875 and ~$583 per month*, respectively.

This transparent structure enables users to budget accurately, scaling their resources as their projects evolve.

![Aleph pricing model](https://pbs.twimg.com/media/GkKhUsAXwAE7jv9?format=png&name=large)

Pricing breakdown as of 18/02/2025

# New features for an optimized user experience

In addition to GPU integration, this release introduces several features aimed at enhancing performance and usability.
Among the key highlights:

* Aggregates caching: By implementing caching for owner aggregates and metrics data, the platform now offers faster response times and smoother performance.
* Pricing estimation: Users can now estimate the cost of executable messages before sending them, providing greater transparency and control.
* Pre-calculated balances: This feature significantly improves system performance by reducing computational overhead.
* Cost service updates: The cost service now uses pricing aggregates for more accurate estimations.

These enhancements collectively make Aleph Cloud more reliable and user-friendly, reinforcing its position as a leading solution for decentralized cloud computing.

# Unlocking new use cases with GPUs

The addition of GPUs extends Aleph Cloud’s capabilities to serve a diverse range of industries and applications:

* AI and machine learning: Train and deploy machine learning models faster and more efficiently.
* Rendering: Create stunning visual effects and high-quality 3D graphics with reduced rendering times.
* Gaming: Deliver seamless, high-performance experiences for cloud gaming platforms.
* AI agents: Power advanced AI agents with robust, decentralized computing resources.

These use cases showcase the versatility of Aleph Cloud’s GPU offering, making it a compelling choice for developers, creators, and businesses alike.

# Looking Ahead

The release of GPU integration and the accompanying features represents a significant leap forward for Aleph Cloud and its community. As we continue to enhance our platform, we remain committed to empowering users with the tools they need to innovate, scale, and succeed in a decentralized world. To explore these new features and start leveraging GPUs on Aleph Cloud, visit [console.twentysix.cloud](https://console.twentysix.cloud/computing/gpu-instance/) today. For more updates, follow us on Twitter and join our Discord community. Together, let’s redefine the future of cloud computing.

*These prices are converted to USD from the current price of the ALEPH token and may change over time.*
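As a minimal sketch of the 80/20 payment-stream split described in the PAYG section above: the function name and the hourly rate below are illustrative assumptions, not part of the actual Aleph Cloud API; only the 80/20 ratio comes from the article.

```typescript
// Sketch only: illustrates the 80/20 revenue split described in the article.
// The rate and function names are assumptions, not the Aleph Cloud API.
interface PaymentStreams {
  toNodeOperator: number; // 80% of the stream, paid to the compute provider
  toTreasury: number;     // 20% of the stream, paid to the Aleph Cloud treasury
}

function splitPaygStream(alephPerHour: number): PaymentStreams {
  return {
    toNodeOperator: alephPerHour * 0.8,
    toTreasury: alephPerHour * 0.2,
  };
}

// Example: a GPU instance metered at a hypothetical 10 ALEPH per hour.
const streams = splitPaygStream(10);
console.log(streams); // { toNodeOperator: 8, toTreasury: 2 }
```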
3 days ago
DIMO Unveils Login with DIMO: A New React SDK for Developers
DIMO has officially launched "Login with DIMO," a React SDK designed to facilitate integration with its APIs and SDKs for developers within the DIMO ecosystem. This toolkit features pre-built, customizable components that not only save time but also enhance the user experience. By utilizing this new SDK, developers can create applications that leverage DIMO’s decentralized vehicle protocol, ensuring a consistent and intuitive experience for users across the platform. The "Login with DIMO" SDK simplifies the integration process by providing developers with ready-to-use React components that connect seamlessly with DIMO’s APIs for account authentication and transaction management. These components allow for full customization, enabling developers to maintain their app's branding while ensuring a cohesive user experience across various DIMO applications. The SDK also supports multiple integration modes, catering to different use cases and expediting the development process, allowing developers to focus on innovation. Currently, "Login with DIMO" is already being utilized in several live applications, showcasing its effectiveness in enhancing user experience and streamlining development. Some examples include Roil, which offers hidden charging rewards, and DLP Labs, which allows users to contribute vehicle data for AI training. The benefits for developers include faster integration and customizable components, while users enjoy an intuitive login experience and improved usability. Looking ahead, DIMO plans to introduce features such as customizable text and UTM tracking, further enhancing the SDK's capabilities and solidifying its role in the future of DIMO’s decentralized vehicle protocol.
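To illustrate the kind of integration described above, here is a hypothetical React/TypeScript sketch of wiring a pre-built login component into an app. The component name, props, and simulated session are assumptions made for illustration only; they do not reflect the actual "Login with DIMO" SDK API, which developers should take from DIMO's own documentation.

```tsx
// Hypothetical usage sketch: the component, props, and package name referenced
// below are assumptions for illustration, NOT the documented "Login with DIMO" API.
import React from "react";

type DimoSession = { walletAddress: string; token: string };

type LoginWithDimoProps = {
  clientId: string;                        // assumed prop: identifies the integrating app
  onSuccess: (session: DimoSession) => void;
};

// Stand-in for the pre-built, customizable component such an SDK would export,
// e.g. `import { LoginWithDimo } from "@dimo/login-react"` (assumed package name).
function LoginWithDimo({ clientId, onSuccess }: LoginWithDimoProps) {
  return (
    <button
      onClick={() =>
        // A real SDK would run the DIMO authentication flow here; this stub
        // simply simulates a successful login for the example.
        onSuccess({
          walletAddress: "0x0000000000000000000000000000000000000000",
          token: "demo-token",
        })
      }
    >
      Log in with DIMO (app: {clientId})
    </button>
  );
}

export function App() {
  return (
    <LoginWithDimo
      clientId="your-app-id"
      onSuccess={({ walletAddress }) => {
        // After authentication, the app can call DIMO APIs on behalf of the user.
        console.log("Signed in as", walletAddress);
      }}
    />
  );
}
```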