  • Counterscale is an open-source web analytics service that is self-hostable on Cloudflare's developer cloud. It deploys as a single Cloudflare Worker that serves the JavaScript reporting snippet, hosts the reporting endpoint, and provides a dashboard UI written in Remix. The heart of Counterscale is the Workers Analytics Engine, a set of managed services built on top of ClickHouse that allows analytics-style data to be written and queried at massive scale. The service is essentially free for most users, as it falls within Cloudflare's free plan limits.
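
    As a rough sketch of the write path Counterscale builds on (not its actual code; the HITS binding name, the field layout, and the 204 response are assumptions for illustration), a Worker records one data point per tracked request to a Workers Analytics Engine dataset like this:

      // Sketch of the Workers Analytics Engine write API.
      // AnalyticsEngineDataset is provided by @cloudflare/workers-types, and the
      // HITS binding would come from an [[analytics_engine_datasets]] entry in wrangler.toml.
      export interface Env {
        HITS: AnalyticsEngineDataset;
      }

      export default {
        async fetch(request: Request, env: Env): Promise<Response> {
          const url = new URL(request.url);
          // indexes: sampling/index key; blobs: string dimensions; doubles: numeric values.
          env.HITS.writeDataPoint({
            indexes: [url.hostname],
            blobs: [url.pathname, request.headers.get("Referer") ?? ""],
            doubles: [1],
          });
          return new Response(null, { status: 204 });
        },
      };

    The recorded rows can then be queried with SQL over the Analytics Engine HTTP query API, which is the "querying" half that the dashboard relies on.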

  • Cloudflare is now offering its customers a way to block AI bots from scraping website content and using the data without permission to train machine learning models. It can recognize bot activity even when operators lie about their user agent; the detection approach relies on digital fingerprinting. With a network that sees an average of 57 million requests per second, Cloudflare has ample data to determine which fingerprints can be trusted.

  • Cloudflare's latest report reveals that nearly 7% of internet traffic is malicious, driven by events like wars and elections, with DDoS attacks being the primary weapon of choice.

  • A rarely used DNS record type called LOC specifies a physical location. While most DNS record types handle basic website information, LOC records store geographical coordinates. A bug in Cloudflare's DNS server (RRDNS) prevented it from serving LOC records correctly; this post goes through how it was fixed.
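
    As a hedged illustration (not taken from the post), the snippet below queries a LOC record through Cloudflare's public DNS-over-HTTPS JSON API; example.com is a placeholder zone and the coordinates in the comment are arbitrary:

      // LOC (type 29, RFC 1876) stores latitude/longitude/altitude plus precision fields.
      // In zone-file form a LOC record looks roughly like:
      //   example.com. IN LOC 51 30 12.748 N 0 7 39.612 W 0.00m 1m 10000m 10m
      async function lookupLoc(name: string): Promise<string | undefined> {
        const res = await fetch(
          `https://cloudflare-dns.com/dns-query?name=${encodeURIComponent(name)}&type=LOC`,
          { headers: { accept: "application/dns-json" } },
        );
        const body = (await res.json()) as { Answer?: { type: number; data: string }[] };
        // A resolver affected by the RRDNS bug would fail to serve this record correctly.
        return body.Answer?.find((a) => a.type === 29)?.data;
      }

      lookupLoc("example.com").then((loc) => console.log(loc ?? "no LOC record"));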

  • Proofpoint observed an increase in malware delivery through the abuse of Cloudflare Tunnels, particularly with the TryCloudflare feature. The campaign delivers remote access trojans (RATs) and has been evolving to bypass detection. It involves sending malicious emails with URLs or attachments, leading to the download and installation of malware like Xworm, AsyncRAT, VenomRAT, GuLoader, and Remcos.

  • Cloudflare plans to launch a marketplace in the next year that will allow website owners to sell AI model providers access to scrape their site's content. It is part of the company's plan to give publishers greater control over how and when AI bots scrape their websites. As part of this plan, the company launched free observability tools for customers on Monday, giving website owners a dashboard to view analytics on why, when, and how often AI models are crawling their sites for information. It is also letting customers block AI bots from their sites with the click of a button.

  • Cloudflare has introduced a significant enhancement to its Durable Objects (DO) by integrating zero-latency SQLite storage, fundamentally changing how applications can manage data in the cloud. Traditional cloud storage often suffers from latency due to network access and the need for synchronization across multiple clients. With Durable Objects, however, application code runs directly where the data is stored, eliminating context switching and allowing near-instantaneous access to data. Previously, Durable Objects provided only key/value storage; the new integration with SQLite adds a full SQL query interface, complete with tables and indexes. SQLite is widely recognized for its speed and reliability, making it an ideal choice for this architecture: by embedding SQLite directly within Durable Objects, Cloudflare enables applications to execute SQL queries with minimal latency, often completing in microseconds.

    Durable Objects are part of the Cloudflare Workers serverless platform, functioning as small servers that maintain state both in-memory and on-disk. Each DO can be uniquely addressed, allowing for global access and coordination of operations. This architecture is particularly beneficial for applications requiring real-time collaboration, such as document editing, where multiple users can interact with the same data seamlessly. The design emphasizes scalability by encouraging the creation of many objects to handle increased traffic rather than relying on a single object, which allows state and traffic to be distributed efficiently across the network.

    One standout feature of the SQLite integration is that database queries are synchronous. Unlike traditional asynchronous database calls, which can introduce complexity and potential bugs, synchronous queries in Durable Objects keep application state consistent and predictable. This design choice simplifies coding and improves performance, as the application can execute queries without waiting on I/O. To address write durability, Cloudflare has implemented a mechanism called "Output Gates": the application continues processing without waiting for write confirmations, but responses to clients are only sent after the writes have been confirmed as durably stored. This dual approach maintains both high throughput and low latency.

    The integration also sidesteps common database pitfalls such as the "N+1 selects" problem, since developers can write straightforward queries without optimizing around network round trips. Additionally, SQLite-backed Durable Objects offer point-in-time recovery, letting users revert to any state within the last 30 days as a safety net against data corruption. Developers implement SQLite-backed Durable Objects by defining their classes and migrations in the Cloudflare environment, and the pricing model aligns with existing Cloudflare services, offering a competitive structure for SQL queries and storage.

    In contrast to Cloudflare's D1 product, which is a more managed database solution, SQLite-in-Durable-Objects provides a lower-level building block for developers who want more control over their applications. D1 operates within a traditional cloud architecture, while SQLite-in-DO colocates application logic and data storage, offering unique advantages for specific use cases. The underlying technology for this feature is the Storage Relay Service (SRS), which combines local disk speed with the durability of object storage: it records changes in a log format and uses a network of follower machines to ensure data integrity and availability. Overall, zero-latency SQLite storage in Durable Objects represents a significant advancement in cloud computing, enabling developers to build faster, more reliable applications with enhanced data management capabilities.
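
    As a minimal sketch of the developer-facing API described above (the class, table, and method names are invented for illustration, not taken from Cloudflare's examples):

      import { DurableObject } from "cloudflare:workers";

      // A SQLite-backed Durable Object: queries run synchronously against storage
      // that lives with the object, and Output Gates hold outgoing responses until
      // the writes are durable.
      export class HitCounter extends DurableObject {
        constructor(ctx: DurableObjectState, env: unknown) {
          super(ctx, env);
          this.ctx.storage.sql.exec(
            "CREATE TABLE IF NOT EXISTS hits (path TEXT PRIMARY KEY, count INTEGER NOT NULL)"
          );
        }

        // Callable over RPC from a Worker holding this object's stub.
        increment(path: string): number {
          this.ctx.storage.sql.exec(
            "INSERT INTO hits (path, count) VALUES (?, 1) " +
              "ON CONFLICT (path) DO UPDATE SET count = count + 1",
            path
          );
          const row = this.ctx.storage.sql
            .exec("SELECT count FROM hits WHERE path = ?", path)
            .one();
          return Number(row.count);
        }
      }

    Opting a class into the SQLite backend happens in the wrangler migration, which lists it under new_sqlite_classes rather than new_classes.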

  • Cloudflare has introduced a new feature called Speed Brain, designed to improve web page loading times by up to 45%. This matters in an era where performance is crucial for user engagement and retention. Speed Brain leverages the Speculation Rules API to anticipate user navigation patterns, prefetching content before users actually click on links. When a user then navigates to a new page, the content is already cached in their browser, resulting in near-instant loading. The initial implementation focuses on prefetching static content when a user initiates a touch or click event; future updates will introduce more aggressive models, including prerendering, which will not only fetch but also render the next page in advance. The goal is to eliminate latency for static websites without requiring any configuration from users.

    To illustrate its effectiveness, consider an e-commerce site where users typically navigate from a parent category to specific items. By analyzing global request logs, Speed Brain can predict that a user viewing "Men's Clothes" is likely to click on "Shirts", so it begins delivering the relevant static content, such as images, before the user clicks the link, making the page load feel instantaneous when the click occurs. Early tests show that this method can reduce the Largest Contentful Paint (LCP) metric, which measures how quickly the largest visible element on a page loads, by up to 75%. Speed Brain is available on all Cloudflare plan types at no additional cost and can be enabled through the dashboard or API. For free domains it is activated by default, while Pro, Business, and Enterprise users need to enable it manually. Enabling Real User Measurements (RUM) is also recommended to track performance improvements and tune prefetching based on actual user behavior.

    Understanding how browsers load content helps explain why Speed Brain matters. When a user navigates to a web page, the browser must establish a secure connection, send an HTTP request, and retrieve the necessary resources, including HTML, CSS, and JavaScript. This process introduces delays, especially as users navigate through multiple pages. Speed Brain mitigates these delays by prefetching (and eventually prerendering) content based on user interactions, providing a smoother browsing experience. Prefetching techniques have existed for years but often lacked the necessary data insights and flexibility: earlier methods required developers to manually specify which resources to prefetch, leading to inefficiencies. Speed Brain addresses these limitations by dynamically determining prefetch candidates from real-time user interactions rather than static configurations.

    The implementation relies on the Speculation Rules API, which lets the browser receive guidance on when to prefetch content. This minimizes the risk of stale configurations and incorrect prefetching, ensuring that resources are only fetched when they are likely to be needed. The initial conservative model prioritizes safety and efficiency, with plans to explore more aggressive settings in the future. Cloudflare's extensive global network enhances the effectiveness of Speed Brain by serving prefetched content directly from its CDN cache, significantly reducing latency. The feature is currently supported in Chromium-based browsers, with broader adoption expected as the web community standardizes the Speculation Rules API. As Speed Brain evolves, Cloudflare is exploring machine learning to refine its predictions further, enabling more accurate prefetching based on user behavior while maintaining user privacy. Future developments may also include prerendering capabilities and the potential bundling of Speed Brain with other performance-enhancing features like Argo Smart Routing. In conclusion, Speed Brain represents a significant advancement in web performance optimization, and Cloudflare encourages users to enable it and use RUM tools to monitor its impact on their website's performance.
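
    The browser-facing mechanism described above can be sketched as follows (a hedged illustration of the Speculation Rules API, not Cloudflare's implementation; the Worker and the /speculation-rules.json path are placeholders):

      // Attach a Speculation-Rules header pointing the browser at a JSON ruleset.
      // "conservative" eagerness means prefetch when a click/touch on a matching
      // link begins, mirroring the initial model described above.
      export default {
        async fetch(request: Request): Promise<Response> {
          const url = new URL(request.url);

          if (url.pathname === "/speculation-rules.json") {
            const rules = {
              prefetch: [{ where: { href_matches: "/*" }, eagerness: "conservative" }],
            };
            return new Response(JSON.stringify(rules), {
              headers: { "content-type": "application/speculationrules+json" },
            });
          }

          const response = await fetch(request);
          const headers = new Headers(response.headers);
          headers.append("Speculation-Rules", '"/speculation-rules.json"');
          return new Response(response.body, { status: response.status, headers });
        },
      };

    Supporting browsers fetch the ruleset and prefetch matching same-origin links, and those prefetches can be answered from Cloudflare's CDN cache rather than the origin.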

  • Cloudflare recently reported successfully mitigating the largest recorded distributed denial-of-service (DDoS) attack, which peaked at an astonishing 3.8 terabits per second (Tbps). The attack targeted organizations in the financial services, internet, and telecommunications sectors, marking a significant escalation in the scale of DDoS threats. The campaign unfolded over a month and comprised more than 100 hyper-volumetric attacks that attempted to overwhelm network infrastructure with floods of data. In a volumetric DDoS attack, the objective is to flood the target with massive amounts of data, consuming its bandwidth and exhausting the resources of applications and devices so that legitimate users can no longer access the services. The recent attacks were particularly intense, with many exceeding two billion packets per second and three Tbps.

    The compromised devices involved were globally distributed, with notable concentrations in Russia, Vietnam, the United States, Brazil, and Spain. The threat actor behind the campaign used a diverse array of compromised devices, including ASUS home routers, MikroTik systems, DVRs, and web servers. Cloudflare mitigated all of the attacks autonomously, and the peak attack lasted a mere 65 seconds. The attacks primarily employed the User Datagram Protocol (UDP), which allows rapid data transfers without a formal connection, making it a favored method for such assaults. Prior to this incident, Microsoft held the record for defending against the largest volumetric DDoS attack, which peaked at 3.47 Tbps and targeted an Azure customer in Asia.

    Typically, DDoS attackers rely on extensive networks of infected devices, known as botnets, or seek ways to amplify the data sent to the target, which can be achieved with fewer systems. In a related report, Akamai highlighted vulnerabilities in the Common Unix Printing System (CUPS) that could be exploited for DDoS attacks; its research indicated that over 58,000 exposed systems were at risk, and testing showed that vulnerable CUPS servers could repeatedly send thousands of requests, demonstrating a significant amplification risk. The incident underscores the evolving landscape of cybersecurity threats, particularly the increasing scale and sophistication of DDoS attacks, and the importance of robust defenses against them.