Pressure Mounts for Apple as Brazilian Court Demands iOS Sideloading

A Brazilian court has dealt a major blow to Apple’s tightly controlled ecosystem, ordering the tech giant to allow sideloading on iOS within 90 days. The ruling follows similar mandates in the EU, signaling a global push for more open digital marketplaces. With Apple commanding nearly 60% of the U.S. mobile market and over 62% in Japan, the decision could set off a domino effect worldwide.

How does sideloading affect iOS users?

Sideloading occurs when a smartphone user downloads an app from a source other than the official app store. On iOS today, the App Store remains the sole distribution channel — a model that Apple is determined to protect. It’s clear why Apple wants to restrict third-party app distribution. According to StatCounter, the company accounts for less than 30% of the global mobile OS market as of February 2025. Requiring users to download apps from the App Store is a surefire way to keep them in the iOS ecosystem.

However, users do receive some benefit from downloading apps exclusively from the App Store. Because every app undergoes a screening process, users know they’re receiving authentic software that isn’t going to harm their device. If they ever do experience an issue, technical support is usually available. These safeguards don’t necessarily extend to apps downloaded from developer websites or other sources, but many users still want the freedom to choose.

Despite already making similar accommodations in other regions, Apple insists that sideloading will have a negative impact on all iOS users. Judge Pablo Zuniga, who overturned an injunction that would have given Apple more time to consider its next move, said that Apple “has already complied with similar obligations in other countries, without demonstrating a significant impact or irreparable damage to its business model.”

What does this mean for other countries?

If the ruling stands, similar legal battles could emerge worldwide.
With the European Union already setting a precedent, and Brazil potentially following suit, other countries may soon join the movement. While the case in Brazil could be a major catalyst for a future disruption in the iOS ecosystem, it’s still too early to tell. Following the latest ruling, Apple now has 90 days to remove all restrictions on sideloading for all Brazilian iOS users. As expected, the company plans to appeal the decision. source

Pressure Mounts for Apple as Brazilian Court Demands iOS Sideloading Read More »

4 European satellite firms are vying to replace Starlink in Ukraine

EU governments are in talks with four European satellite firms about providing a back-up service for Starlink in Ukraine, as the region pushes to boost tech sovereignty amid mounting transatlantic tensions. Starlink has provided a vital communications system to Ukraine’s military since Russia’s full-scale invasion began in 2022. It allows the armed forces to coordinate drone strikes, identify targets, and stream battlefield data to troops on the ground in real time. However, European leaders are increasingly concerned about relying on Starlink — fears stoked by a Reuters report that US officials had threatened to cut off the system in Ukraine if the country didn’t meet their demands on sharing its mineral wealth. Elon Musk, CEO of Starlink’s parent company SpaceX, denied the claims. Nevertheless, the situation has raised doubts over the security implications of Ukraine — and broader Europe — relying on a single, privately owned network whose boss has direct ties to the Trump administration. In response, discussions with European alternatives to Starlink are in full swing. Leading the pack is French-British satellite provider Eutelsat. The firm’s CEO, Eva Berneke, confirmed to Bloomberg last week that it was in talks with the EU about extending its internet service to Ukraine. Berneke said Eutelsat was also in “very positive talks” with Italy, whose government is fiercely debating whether to pick Starlink to provide encrypted communications for government officials. Three other companies — the UK’s Inmarsat, Luxembourg’s SES, and Spain’s Hisdesat — told the Financial Times they were also in talks with governments and EU institutions about how to provide back-up connectivity to Ukraine. Miguel Ángel García Primo, CEO of Hisdesat, which provides secure satellite communications for governments, said his company had been contacted by several European officials.
But replacing Starlink won’t be easy. Starlink is by far the largest satellite comms provider, with over 7,000 satellites in low-Earth orbit and 40,000 terminals on the ground in Ukraine alone. However, Berneke was bullish on Eutelsat’s capabilities. She said that the firm could match Starlink’s terminal count in Ukraine “probably in a couple of months.” Eutelsat already has 2,000 terminals in the war-torn country. Whether a Starlink back-up service would comprise a mesh of different operators or just one is unclear at this stage. Either way, these four companies likely present Europe’s best option right now to cut ties with Musk’s firm and put the region’s satellite communications in safer hands. Longer term, Europe has its bets placed on IRIS², a planned multi-orbit satellite internet constellation expected to switch on in 2030. There are also reports of an Airbus-Leonardo-Thales Alenia Space joint venture called “Project Bromo” that could challenge Starlink’s global dominance. source

4 European satellite firms are vying to replace Starlink in Ukraine Read More »

Dataiku unifies data, adds AI for better analytics outcomes

00:00 Hi, everybody. Welcome to DEMO, the show where companies come in and showcase their latest products and services. Today, I’m joined by Conor Jensen. He is the Global Field CDO at Dataiku. Welcome to the show, Conor.
00:10 Thanks. I’m really happy to be here, Keith.
00:11 And CDO—I’m going to guess that stands for Chief Data Officer?
00:13 It does. Chief Data Officer. My background is in the data science space. Sometimes people think it means Chief Digital Officer.
00:21 Now, Dataiku has several different pronunciations. Can you explain the company name and then tell us what you’re going to show us today?
00:28 Sure. It’s really just a portmanteau of data and haiku. There’s nothing more to it than that. But because it’s a French company—founded in France about 12 years ago—the French tend to pronounce it Da-tie-koo with a very soft “H.” Americans usually say Data-IKU. Since we’re a global company, I’ve probably heard 20 to 30 different pronunciations. We welcome them all.
00:53 At first, I thought it was a Japanese company because it reminded me of an anime show.
00:56 Totally! I can see why. The Japanese pronounce it differently as well. So, yeah, there’s a wide range of ways people say it.
01:03 All right. So give us an overview of what you’re going to show today and the whole purpose behind it.
01:07 Dataiku, our core platform, allows people—regardless of their skill set, whether they’re coders, non-coders, or anywhere in between—to access data wherever they need it, whether in the cloud, on-premises, or elsewhere. It supports the full spectrum of analytics, from basic data analysis to machine learning and generative AI, all from a single UI with built-in governance. Today, we’re going to walk through a use case that demonstrates what working with Dataiku looks like in practice.
01:35 A lot of companies that come on this show build products and platforms tailored to specific job roles within an enterprise.
It feels like Dataiku is designed for a wide range of users across different roles. Would you say that’s the case?
01:49 Absolutely. It’s for anyone who needs to work with data as part of their job. We see users from data engineering and data science, as well as analytics professionals in fields like sales and marketing. It’s a highly versatile platform with broad use across the enterprise.
02:04 Got it. I’ve also had a lot of companies come on here that focus on data access—pulling data from multiple sources, compiling it, and generating insights. What makes Dataiku different from those other platforms?
02:14 The key difference is that we don’t move the data. Dataiku operates as a UI layer on top of your existing data sources. Your data stays where it is. We do pull a small sample locally so users can interact with the data in real time—similar to working in Excel—but the actual data remains in its original source. When you finalize your work, the computations are pushed down to where the data resides, whether that’s a SQL database, a machine learning environment, or in-memory processing. This means you can use Dataiku to access data from local SQL servers, S3 buckets, or Kubernetes clusters without physically moving large datasets around.
02:58 That makes sense. My last question before we jump into the demo—what would companies be doing if they didn’t have this product? Would they be using another platform, or would they be accessing data manually?
03:13 Right, they’d be doing it manually—or at least in a more fragmented way. The reality is that these activities already exist, but they’re happening in siloed environments. For example, analysts might be using Excel, Alteryx, or point solutions. Engineers might be working in Tableau Data Prep or writing SQL scripts. Data scientists could be using notebooks, Databricks, or other tools. The problem is that these teams can’t easily collaborate.
They often end up emailing Excel files back and forth, which is time-consuming and error-prone. Dataiku brings all of this together into a single platform where everyone—analysts, engineers, and data scientists—can work collaboratively.
03:59 All right, let’s get to the demo. Since Dataiku is used across multiple job roles, I believe I asked you to come up with a use case scenario that highlights the platform in action.
04:10 Exactly. We’ll walk through a use case that demonstrates how different users interact with the platform. When you log into Dataiku, it runs entirely in a browser. It’s hosted in the cloud, in a VPC, or on-premises. When you enter, you land on your homepage. For this demo, I’ll take on the role of a financial analyst in an FP&A (Financial Planning & Analysis) team. My job is to generate reports quickly and accurately. I have a project built out, but as an analyst, I don’t want to deal with the complexities of the backend—I just need my report. Here’s an example: I select a business unit, say The Americas without the USA, click a button, and instantly get a generated report. This report was drafted using an LLM (large language model), which pulls real-time data from structured databases. It’s not hallucinating numbers—it’s querying SQL databases and forecasts to provide a real-world summary of sales performance, what’s working, what’s not, and projected trends.
08:36 Do companies need to know where all their data sources are, or does Dataiku help them discover that?
08:43 Great question. Users can search for data within Dataiku’s catalog. Any data asset available to them will appear, making it easier to discover and access data without knowing exactly where it resides. If a user doesn’t have access to certain datasets, they’ll still see that the data exists and can request permissions from security teams.
09:13 Because some companies struggle with simply knowing where all their data is, right?
09:21 Exactly. That’s often the first challenge.
Dataiku helps solve this by centralizing access while respecting existing security controls.
10:10 I got you off track—let’s go back to the demo.
10:14 No problem! So, we’ve looked at how an analyst interacts with the platform. Now, let’s look at the data engineer’s role. Behind the scenes,

Dataiku unifies data, adds AI for better analytics outcomes Read More »

6th Circ. Won't Revisit FCC's Tanked Net Neutrality Rules

By Christopher Cole (March 11, 2025, 7:26 PM EDT) — The Sixth Circuit on Tuesday turned down a bid by public interest groups for a full-court rehearing of January’s decision to overturn the Federal Communications Commission’s net neutrality rules…. source

6th Circ. Won't Revisit FCC's Tanked Net Neutrality Rules Read More »

Major AI market share shift revealed: DALL-E plummets 80% as Black Forest Labs dominates 2025 data

New data reveals dramatic AI market share shifts in 2025, with rapid changes in how businesses and consumers utilize artificial intelligence tools. Poe, a platform that hosts more than 100 AI models, has released a comprehensive report that provides an unprecedented look into real-world usage patterns across text, image and video generation technologies. Poe’s analysis, based on interactions from millions of users over the past year, offers technical decision-makers crucial insights into a competitive field where usage data is typically closely guarded. “As AI models continue to progress, we believe they will become central to how people acquire knowledge, tackle complex tasks and manage everyday work,” the company writes. The findings highlight significant market fragmentation across all AI modalities. While established players like OpenAI and Anthropic maintain dominant positions in text generation, newer entrants such as DeepSeek (in text) and Black Forest Labs (in image generation) have quickly captured meaningful market share, suggesting a dynamic ecosystem despite massive investments flowing toward industry leaders. Here are the five most surprising takeaways from Poe’s analysis of the early 2025 AI ecosystem. A chart tracking AI model usage on Poe during 2024-2025 shows OpenAI’s GPT-4o and Anthropic’s Claude models dominating the text generation market, while newcomers like DeepSeek have begun to capture meaningful market share. (Credit: Poe) 1. Google shows uneven performance across AI modalities Google’s varied performance across different AI modalities reveals the challenges of achieving cross-modal leadership. Its Gemini family of text models “saw growing message share through October 2024,” but has been “declining since” despite substantial investment and technical capabilities.
This contrasts sharply with Google’s performance in other categories. In image generation, Google’s Imagen3 family has secured an impressive 30% market share, while in video generation, its Veo-2 model has rapidly captured 40% of messages. This mixed performance suggests that technical excellence alone doesn’t guarantee market leadership. For enterprise decision-makers, this underscores the importance of evaluating AI capabilities on a modality-by-modality basis rather than assuming leadership in one area translates to excellence across all AI capabilities. 2. Video generation experiences high-velocity competition Video generation, the newest frontier in generative AI, has already witnessed intense competition and rapidly shifting leadership positions. According to the report, “The video generation category, while only existing starting in late 2024, has rapidly expanded to more than eight providers now offering diverse options to subscribers.” Google’s Veo-2 model (yellow) emerged in February 2025 to capture 39.8% of video generation messages, rapidly displacing early leader Runway (blue), which fell to 31.6% despite its first-mover advantage. (Credit: Poe) Runway, an early pioneer, “has maintained a strong position with 30 to 50% of video gen messages” despite having only a single API model. However, Google’s entrance has immediately disrupted the status quo: “Google’s Veo-2, since its recent launch on Poe, rapidly captured nearly 40% of total video gen messages in just a few weeks.” Chinese-developed models collectively account for approximately 15% of video generation messages. Models like “Kling-Pro-v1.5, Hailuo-AI, HunyuanVideo and Wan-2.1 continue to push the frontier on capabilities, inference time and cost,” demonstrating that international competition remains a significant factor in driving innovation despite geopolitical tensions. 3. 
Image generation undergoes radical transformation The image generation field demonstrates perhaps the most dramatic market shift in gen AI, with established players rapidly losing ground to newcomers. “First-mover image gen models like Dall-E-3 and various Stable Diffusion versions were pioneers in the space, but have seen their relative usage share drop nearly 80% as the number of official image gen models has grown from 3 to ~25,” the report states. Black Forest Labs emerged as the surprise leader: “Black Forest Labs’s Flux family of image generation models burst onto the scene in mid 2024 and has maintained its dominant position as the clear frontrunner since, capturing close to 40% of messages.” This represents a remarkable achievement for a relative newcomer against established competitors with vast resources. The image generation market underwent a complete reversal from early 2024 to 2025, with Black Forest Labs’ Flux models and Google’s Imagen3 displacing early leader Dall-E-3, according to Poe usage data. (Credit: Poe) Google’s strategic investment in image generation is also bearing fruit, with “Google’s Imagen3 family…on a steady growth since its late 2024 launch, carving out almost 30% usage share.” This positions Google as a strong second-place contender despite its later market entry. 4. Newer models rapidly cannibalize older ones Poe’s data reveals a concerning trend for AI companies investing heavily in maintaining older models: “As frontier labs release more capable models, usage of the new flagship model in a provider’s offering quickly cannibalizes the older versions.” This pattern manifests across companies, with users rapidly abandoning GPT-4 for GPT-4o and Claude-3 for Claude 3.5. The implication is clear: Maintaining backward compatibility and support for legacy models may have diminishing returns as users consistently migrate to the newest offerings.
Companies may need to reconsider their product lifecycle strategies, potentially focusing resources on fewer models with more frequent updates rather than maintaining extensive families of offerings with varying capabilities and price points. 5. Text AI duopoly faces new challengers OpenAI and Anthropic maintain dominance in text generation, but face increasing pressure from newer entrants. According to Poe’s data, “text usage across OpenAI and Anthropic models has been nearly equal, showcasing growing competition in the highly expressive text modality” since Claude 3.5 Sonnet’s launch in June 2024. Together, these two companies command approximately 85% of text interactions on the platform. Anthropic’s rapid ascension to parity with OpenAI suggests that quality and capability improvements can quickly translate to market share shifts, even in a field with strong network effects and first-mover advantages. More intriguing is DeepSeek’s emergence as a legitimate third contender. The report notes that “DeepSeek-R1 and V3 went from no usage in December 2024 to gain 7% of messages at their peak, a significantly higher level than any previous open-source model family, such as Llama and Mistral.” This dramatic rise indicates that barriers

Major AI market share shift revealed: DALL-E plummets 80% as Black Forest Labs dominates 2025 data Read More »

LEGO® FORMULA 1® 1:1 Ferrari Race Car Model and New Series Display Wall Arrive at AIRSIDE

The LEGO Group has this year partnered officially with Formula 1® for the first time, releasing 20 LEGO® Formula 1® sets that faithfully recreate all 10 Formula 1® teams, to huge global attention. The 1:1 LEGO® F1® Ferrari race car model that recently toured districts across Hong Kong caused a sensation everywhere it went, drawing constant camera flashes. With the tour complete, the 1:1 LEGO® F1® Ferrari model has hit the accelerator and sped into 《LEGO® Formula 1® 飆速「拼」戰》 (roughly, the "Speed Build Battle") at partner venue AIRSIDE in Kai Tak. From 14 March to 7 April 2025, visitors can admire the presence and craftsmanship of the 1:1 model, while multiple interactive zones and games on site test creativity and agility. Visit all the play zones and collect the designated stamps to redeem a limited-edition gift. The pop-up store sells a range of crossover merchandise, including the newly released LEGO® Formula 1® sets, with a chance to obtain limited copies signed by Formula 1® drivers or team principals; LEGO® fans and motorsport fans should not miss it.

The event space is designed as a FORMULA 1® circuit, and its undisputed centerpiece is the giant 1:1 LEGO® F1® Ferrari model. Based on Ferrari's SF-24 FORMULA 1® car and led by Andy Hung (洪子健), Hong Kong's only LEGO® Certified Professional, the car uses more than 200,000 LEGO® bricks, took over 336 hours to design and about 1,300 hours to build, and weighs 500 kg. It interprets the car's details, functions and technology as realistically as possible, from the brick-built steering system, two-speed gearbox and printed tyres, to the adjustable rear wing and the V6 engine with a rotating MGU-H (heat recovery system). Every detail is meticulous and astonishing.

With full official FORMULA 1® support, the LEGO Group also worked with "STEM RACING™ Hong Kong" to build an 8-metre high-speed track, made from high-quality materials such as aluminium alloy and composite panels to ensure durability and a sufficiently smooth surface. The track is equipped with timing sensors, a pneumatic launch system and safety barriers, and LEGO® Speed Champions cars modified by "STEM RACING™ Hong Kong" race along it, faithfully recreating real racing conditions. Also on display is an Air Trace Visualisation Tunnel, a tool that uses smoke or vapour to visually trace the airflow patterns around a model car, used to study aerodynamic performance, identify drag and lift, and refine vehicle designs for greater speed and stability. These two clever touches bring visitors even closer to the atmosphere of FORMULA 1®.

《LEGO® FORMULA 1® 飆速「拼」戰》 also features many LEGO® building and FORMULA 1® elements, including 1-metre-tall LEGO® driver and mechanic figures, a display wall for the new LEGO® FORMULA 1® product series, and FORMULA 1® driver exhibits, all worth viewing, photographing and sharing with the whole family.

《LEGO® FORMULA 1® 飆速「拼」戰》 play zones: interactive games test creativity and agility. The event offers multiple interactive zones and games that let everyone push their limits and learn about motorsport. Visit all the play zones and collect the designated stamps to redeem a limited LEGO® F1® gift pack, which includes a pit pass (Pitpass) and a LEGO® product cash voucher. Download the AIRSIDE "NF Touch" mobile app to redeem free limited-edition LEGO® F1® stickers. (Gifts are available in limited quantities, while stocks last.) source

LEGO® FORMULA 1® 1:1 Ferrari Race Car Model and New Series Display Wall Arrive at AIRSIDE Read More »

Cohere targets global enterprises with new highly multilingual Command A model requiring only 2 GPUs

Canadian AI startup Cohere — cofounded by one of the authors of the original transformer paper that kickstarted the large language model (LLM) revolution back in 2017 — today unveiled Command A, its latest generative AI model designed for enterprise applications. As the successor to Command-R, which debuted in March 2024, and Command R+ following it, Command A builds on Cohere’s focus on retrieval-augmented generation (RAG), external tool use and enterprise AI efficiency — especially with regards to compute and the speed at which it serves up answers. That’s going to make it an attractive option for enterprises looking to gain an AI advantage without breaking the bank, and for applications where prompt responses are needed — such as finance, health, medicine, science and law. With faster speeds, lower hardware requirements and expanded multilingual capabilities, Command A positions itself as a strong alternative to models such as GPT-4o and DeepSeek-V3 — classic LLMs, not the new reasoning models that have taken the AI industry by storm lately. Unlike its predecessor, which supported a context length of 128,000 tokens (referencing the amount of information the LLM can handle in one input/output exchange, about equivalent to a 300-page novel), Command A doubles the context length to 256,000 tokens (equivalent to 600 pages of text) while improving overall efficiency and enterprise readiness. It also comes on the heels of Cohere for AI — the non-profit subsidiary of the company — releasing an open-source (for research only) multilingual vision model called Aya Vision earlier this month. A step up from Command-R When Command-R launched in early 2024, it introduced key innovations like optimized RAG performance, better knowledge retrieval and lower-cost AI deployments.
It gained traction with enterprises, integrating into business solutions from companies like Oracle, Notion, Scale AI, Accenture and McKinsey, though a November 2024 report from Menlo Ventures surveying enterprise adoption put Cohere’s market share among enterprises at a slim 3%, far below OpenAI (34%), Anthropic (24%), and even small startups like Mistral (5%). Now, in a bid to become a bigger enterprise draw, Command A pushes these capabilities even further. According to Cohere, it:
Matches or outperforms OpenAI’s GPT-4o and DeepSeek-V3 in business, STEM and coding tasks
Operates on just two GPUs (A100 or H100), a major efficiency improvement compared to models that require up to 32 GPUs
Achieves faster token generation, producing 156 tokens per second — 1.75x faster than GPT-4o and 2.4x faster than DeepSeek-V3
Reduces latency, with a 6,500ms time-to-first-token, compared to 7,460ms for GPT-4o and 14,740ms for DeepSeek-V3
Strengthens multilingual AI capabilities, with improved Arabic dialect matching and expanded support for 23 global languages.
Cohere notes in its developer documentation online that: “Command A is Chatty. By default, the model is interactive and optimized for conversation, meaning it is verbose and uses markdown to highlight code. To override this behavior, developers should use a preamble which asks the model to simply provide the answer and to not use markdown or code block markers.”
Built for the enterprise
Cohere has continued its enterprise-first strategy with Command A, ensuring that it integrates seamlessly into business environments.
Key features include:
Advanced retrieval-augmented generation (RAG): Enables verifiable, high-accuracy responses for enterprise applications
Agentic tool use: Supports complex workflows by integrating with enterprise tools
North AI platform integration: Works with Cohere’s North AI platform, allowing businesses to automate tasks using secure, enterprise-grade AI agents
Scalability and cost efficiency: Private deployments are up to 50% cheaper than API-based access.
Multilingual and highly performant in Arabic
A standout feature of Command A is its ability to generate accurate responses across 23 of the most spoken languages around the world, including improved handling of Arabic dialects. Supported languages (according to the developer documentation on Cohere’s website) are: English, French, Spanish, Italian, German, Portuguese, Japanese, Korean, Chinese, Arabic, Russian, Polish, Turkish, Vietnamese, Dutch, Czech, Indonesian, Ukrainian, Romanian, Greek, Hindi, Hebrew and Persian.
In benchmark evaluations:
Command A scored 98.2% accuracy in responding in Arabic to English prompts — higher than both DeepSeek-V3 (94.9%) and GPT-4o (92.2%).
It significantly outperformed competitors in dialect consistency, achieving an ADI2 score of 24.7, compared to 15.9 (GPT-4o) and 15.7 (DeepSeek-V3). Credit: Cohere
Built for speed and efficiency
Speed is a critical factor for enterprise AI deployment, and Command A has been engineered to deliver results faster than many of its competitors.
Token streaming speed for 100K context requests: 73 tokens/sec (compared to GPT-4o at 38/sec and DeepSeek-V3 at 32/sec)
Faster first token generation: Reduces response time significantly compared to other large-scale models
Pricing and availability
Command A is now available on the Cohere platform and with open weights for research use only on Hugging Face under a Creative Commons Attribution Non Commercial 4.0 International (CC BY-NC 4.0) license, with broader cloud provider support coming soon.
Input tokens: $2.50 per million
Output tokens: $10.00 per million
Private and on-prem deployments are available upon request.
Industry reactions
Several AI researchers and Cohere team members have shared their enthusiasm for Command A. Dwaraknath Ganesan, who works on pretraining at Cohere, commented on X: “Extremely excited to reveal what we have been working on for the last few months! Command A is amazing. Can be deployed on just 2 H100 GPUs! 256K context length, expanded multilingual support, agentic tool use… very proud of this one.” Pierre Richemond, AI researcher at Cohere, added: “Command A is our new GPT-4o/DeepSeek v3 level, open-weights 111B model sporting a 256K context length that has been optimized for efficiency in enterprise use cases.” Building on the foundation of Command-R, Cohere’s Command A represents the next step in scalable, cost-efficient enterprise AI. With faster speeds, a larger context window, improved multilingual handling and lower deployment costs, it offers businesses a powerful alternative to existing AI models. source
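Cohere’s developer-documentation note quoted above says Command A is “chatty” by default and should be given a preamble when terse, markdown-free answers are wanted. Below is a minimal sketch of how that might look with Cohere’s Python SDK; the client class, the `chat` call shape, and the model ID `command-a-03-2025` are assumptions to verify against Cohere’s current documentation.

```python
# Sketch: suppressing Command A's verbose/markdown default via a preamble
# passed as a system message. SDK call shape and model ID are assumptions;
# check Cohere's documentation for the exact names.
import os

PREAMBLE = ("Provide only the direct answer. "
            "Do not use markdown or code block markers.")

def build_messages(user_prompt: str) -> list:
    """Prepend the terse-output preamble as a system message."""
    return [{"role": "system", "content": PREAMBLE},
            {"role": "user", "content": user_prompt}]

def ask(user_prompt: str):
    """Send the prompt to Command A (requires a Cohere API key)."""
    import cohere  # pip install cohere
    co = cohere.ClientV2(api_key=os.environ["CO_API_KEY"])
    return co.chat(model="command-a-03-2025",
                   messages=build_messages(user_prompt))
```

The preamble itself is just an instruction string, so the same pattern works whether the model is reached via Cohere’s hosted API or the open-weights checkpoint.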

Cohere targets global enterprises with new highly multilingual Command A model requiring only 2 GPUs Read More »

Billions of Devices at Risk of Hacking Due to Hidden Commands

Tarlogic team giving their presentation during RootedCON. Image: Tarlogic Billions of devices worldwide rely on a widely used Bluetooth-Wi-Fi chip that contains undocumented “hidden commands.” Researchers warn these commands could be exploited to manipulate memory, impersonate devices, and bypass security controls. ESP32, manufactured by a Chinese company called Espressif, is a microcontroller that enables Bluetooth and Wi-Fi connections in numerous smart devices, including smartphones, laptops, smart locks, and medical equipment. Its popularity is partly due to its low cost, with units available for just a few dollars. Hidden Bluetooth commands and potential exploits Researchers at security firm Tarlogic discovered 29 undocumented Host Controller Interface (HCI) commands within the ESP32’s Bluetooth firmware. These commands enable low-level control over some Bluetooth functions, such as reading and writing memory, modifying MAC addresses, and injecting malicious packets, according to Bleeping Computer, which attended Tarlogic’s presentation at RootedCON. SEE: Zscaler Report: Mobile, IoT, and OT Cyber Threats Surged in 2024 While these functions aren’t inherently malicious, bad actors could exploit them to stage impersonation attacks, introduce and hide backdoors, or modify device behavior — all while bypassing code audit controls. Such incidents could lead to a supply chain attack targeting other smart devices. “Malicious actors could impersonate known devices to connect to mobile phones, computers and smart devices, even if they are in offline mode,” the Tarlogic researchers wrote in a blog post. “For what purpose? To obtain confidential information stored on them, to have access to personal and business conversations, and to spy on citizens and companies.” What are the barriers to entry for these exploits?
Despite the risks, there are barriers to entry for exploiting these commands, which distinguishes them from typical backdoor vulnerabilities. Attackers would need physical access to the smart device’s USB or UART interface, or they would need to have already compromised the firmware through stolen root access, pre-installed malware, or other vulnerabilities to exploit the commands remotely. What happens next? Tarlogic researchers Miguel Tarascó Acuña and Antonio Vázquez Blanco discovered the vulnerable HCI commands using BluetoothUSB, a free hardware-independent, cross-platform tool that enables access to Bluetooth traffic for security audits and testing. These hidden commands are likely hardware-debugging Opcode instructions that were unintentionally left exposed; TechRepublic has contacted Espressif to confirm but the company has yet to respond as of writing. The company’s response will be crucial in determining whether firmware updates or mitigations will be released to secure affected devices. source
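The HCI commands described above follow the packet layout defined by the Bluetooth Core Specification, which is what audit tools like Tarlogic's driver speak over USB or UART. A brief sketch of that layout; the OGF/OCF values here are illustrative placeholders, not the actual undocumented ESP32 opcodes.

```python
# Sketch: how a raw HCI command packet is laid out (Bluetooth Core Spec,
# UART transport). The OGF/OCF values below are illustrative only.

def hci_command_packet(ogf: int, ocf: int, params: bytes = b"") -> bytes:
    """Build a raw HCI command packet (UART packet type 0x01)."""
    opcode = (ogf << 10) | ocf           # 16-bit opcode: 6-bit OGF + 10-bit OCF
    return bytes([0x01,                  # HCI packet indicator: command
                  opcode & 0xFF,         # opcode, little-endian
                  (opcode >> 8) & 0xFF,
                  len(params)]) + params # parameter length + payload

# Vendor-specific commands live in OGF 0x3F; OCF 0x0001 here is hypothetical.
pkt = hci_command_packet(0x3F, 0x0001, bytes.fromhex("0011223344"))
```

Undocumented vendor commands like the ones Tarlogic found sit in the vendor-specific opcode group (OGF 0x3F), which is why they can pass unnoticed by audits that only check the standardized command set.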

Billions of Devices at Risk of Hacking Due to Hidden Commands Read More »

CRM Database Explained: Definition, Benefits & Best Practices

What is a CRM database?
A customer relationship management (CRM) database is a resource that contains all customer data that all departments within an organization can access. This data includes customer information, sales reports, and marketing statistics. Companies use all this information to track, evaluate, and direct customer interactions.
What makes a good CRM database
A good CRM database allows you to collect a broad spectrum of customer information and store it in a centralized location. This data includes the following:
Personal details: Customer’s name, address, and phone number
Lead source: The channel through which you acquired the lead or customer
Online behavior: The last time a customer visited your business website or engaged with your online content
Customer interactions: A customer’s interactions with your sales and support team
Other information: Miscellaneous information that you can use for personalization, such as a customer’s hobbies, favorite pet, or preferred brands
Key features of a good CRM database
A good CRM database offers features that will help businesses organize their leads and contacts, personalize customer engagement, and generate meaningful insights from relevant data.
These key CRM features include the following:

Contact management: Collects, stores, and enriches customer data from various sources
Lead management: Automatically assigns new leads to a sales rep and lets users set follow-up reminders and monitor a lead's progress through the sales pipeline
Contact segmentation: Groups customers based on specific criteria, such as demographics and purchase history, to aid in content personalization
Analytics and reporting: Tracks relevant sales and marketing metrics to generate meaningful insights
Integrations: Connects with other apps and tools in your technology stack, such as email marketing software, productivity tools, social media channels, and e-commerce platforms

How does a CRM database work?

A CRM database interacts with the other features of the CRM system it is part of. For instance, you can use stored information to personalize content for your email marketing campaigns. The database also logs all customer interactions and feedback so that your support team can assist each customer based on their latest request.

Modern CRM database systems automatically collect and enrich data from various online sources to eliminate inconsistencies and duplicate entries. Centralized access to all this data gives businesses a holistic view of the customer, eliminating the need to toggle between different platforms to monitor customer behavior and interactions.

3 types of CRM database

CRM database applications fall into three categories, depending on the type of data they process and their distinct purposes:

Operational CRM database: Automates and optimizes customer-facing processes to streamline sales, marketing, and customer service operations.
Collaborative CRM database: Enhances communication and collaboration among internal teams and external partners to streamline information sharing and maintain consistent customer interactions.
Analytical CRM database: Analyzes customer data and extracts valuable insights from it to help businesses build effective marketing campaigns and raise customer satisfaction.

CRM database use cases

A CRM database is a valuable tool for any business that interacts with customers. Here are its main use cases:

Sales

The CRM database serves as a resource for identifying qualified sales leads and tracking data points. It can help streamline sales operations through custom workflows that trigger sales activities such as record updates, email sends, and task assignments.

Image: monday CRM

Marketing

CRM database systems provide marketing reps with information they can use to build targeted campaigns. They also track your audience's responses to your email and marketing campaigns and provide insights into what your leads are interested in.

Image: Pipedrive

Customer service

Support reps can quickly access the history of customer interactions from the centralized CRM database to deliver relevant and effective customer service. They can pick up where the last agent left off, eliminating the need to recap customer questions and requests.

Image: HubSpot

4 benefits of using a CRM database

Here are the top benefits your business can gain from using a CRM database:

Streamlined contact management: A CRM database automates contact management, ensuring all contact information is easily accessible from a centralized location. It also automatically logs communication records for each contact in real time, eliminating the need for reps to key in these updates manually.
Centralized communication: Communication management is a breeze since all team members using the CRM can view each customer's interaction history with the company, including inquiries, requests, and complaints. You can set reminders and automatically assign leads and deals to agents to ensure timely follow-ups.
Improved customer retention: Marketing teams can segment leads, contacts, and customers based on demographic and geographic information saved in the CRM database, then build personalized email and marketing campaigns for each segment. This helps you retain customers by keeping them satisfied and interested in what you offer.
Data-based decision-making: The CRM database is a valuable source of customer data and analytics. Businesses can gain insights into their sales performance, customer behavior, and market trends based on information in the database, enabling data-based decisions about future marketing campaigns, product or service development, and resource allocation.

SEE: 8 Benefits of CRM Software for Businesses

How to get started with a CRM database: Data migration

Customer data comes from various sources, including email, social media, and business websites. While a CRM database makes it easy to gather all this data and access it from a single platform, setup and maintenance can be daunting. To guide you through CRM data migration, I've broken down the process into


European cloud hosts offer an escape from AWS, Azure, and GCP

When the modern-day internet began emerging in the early 2000s, finding hosting services and resources to run the new wave of dynamic web applications was hard. You needed a database to store application data; these were slow, expensive, and unreliable, regularly bringing applications to a grinding halt when a single instance failed. You needed a server to run interpreted languages like PHP, Python, or Ruby; these were equally expensive, often needed configuration, had security issues, and frequently ran out of memory or CPU, again grinding applications to a halt. For anyone on a small budget, running web 2.0-era applications required constant configuration tweaking, tight performance streamlining, and cost reduction, all within the typically tight confines of what a provider would even let you change and manage yourself.

Between those heady days and now, a growing patchwork of hosting providers emerged to cope with the complexity and scale that web applications demanded. Over the past 10 years, a significant proportion of applications have moved to a new generation of "cloud hosting". The term "cloud" is a bit vague, and there's a popular (but not altogether accurate) phrase that "the cloud is just someone else's computer". The cloud abstracts and simplifies the complexity of managing the infrastructure mentioned above. Instead of thinking about servers, you think about services and instances of services. In the modern infrastructure world, when a database is struggling, you add another instance. If you have so many database and application instances that you've lost track of what's happening, you add another service or three for that, too. Taking this abstraction to an extreme, "serverless" has reached peak popularity in the past few years.
This approach aims to reduce servers and services to something more like a function call. Of course, a server still handles all these function calls and responses behind the scenes, but the argument is that you shouldn't need to worry about that and should focus only on sending and receiving data.

More than 20 years later, web-based application developers' lives are surely easier, aren't they? No, not really. There are many issues with developing and maintaining apps that run in the cloud. Thankfully, several European operators are trying to make developers' lives easier again. Before getting to them, here's a quick terminology guide:

Private cloud: Services used by only one customer.
Public cloud: Services shared by more than one customer.

In both cases, customer data and details remain private, and everything could run in one or more locations. The main difference is that the provider carves out a digital tranche of territory just for that customer. This is probably defined in software, but it could be in hardware, and it could be a dedicated server running remotely or locally to the customer.

With that in mind, let's dig into the problems in the cloud computing world.

The cloud is consolidated and monopolised

Cloud computing has hundreds of providers, yet most people only think of three: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) — known as the "hyperscalers" of the hosting industry. The web is a big place, brimming with publicly and privately available sites, so precise numbers of what runs where are hard to come by. However, according to statistics from builtwith.com, about 12% of websites — approximately 86.8 million in total — run on AWS, while the other two "only" host roughly another 12% combined. If you look at hosting companies that call themselves "cloud", then according to techjury.net those shares increase to 32% for AWS, 23% for Azure, and 10% for GCP.
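The serverless model described earlier really is just "data in, data out". A minimal sketch in Python, assuming a generic Lambda-style handler convention (the `event`/`context` signature and the response shape are the common proxy-integration pattern; the greeting logic is invented for illustration):

```python
import json

def handler(event, context):
    """A serverless-style function: no server code, just data in, data out.

    The platform invokes this once per request; scaling, routing, and the
    machines underneath are the provider's concern, not the developer's.
    """
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Simulate one platform invocation locally:
response = handler({"body": json.dumps({"name": "Europe"})}, None)
print(response["body"])
```

Everything outside this function (TLS termination, load balancing, process lifecycle) is exactly the hidden server the phrase "serverless" glosses over.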
Yet even with these statistics, defining what constitutes a website is complicated. Hyperscalers offer hundreds of different services that developers use for one or more parts of an application, some of which perform crucial functions that break an application if they become unavailable. This has caused problems in the past: remember the various times when swathes of online services went down at once? That was probably one of these major companies experiencing an outage.

This has led many developers to take a multi-cloud or hybrid-cloud approach, spreading risk by hosting services across multiple providers. That solves a technical issue, but it brings more revenue to all the cloud providers and increases complexity.

This consolidation puts a tremendous amount of power into a handful of companies. If they change their policies, thousands of businesses could be left without a place to run. More concerning is that the top three are all US companies, as are four of the top five; the sole exception, Alibaba, is based in China. The US already has data privacy, security, and law enforcement policies that concern many companies and jurisdictions, and while all of the companies mentioned offer hosting in a variety of jurisdictions worldwide, what if US politics no longer respected those digital borders? No matter how unlikely some scenarios seem, consolidation is always dangerous.

Diversifying the cloud

Developers and their companies do not want to switch away from the cloud completely. Rather, they are looking for options beyond the hyperscalers, especially in Europe, where increased regulation and insecurity around using American services combine with a degree of nationalism encouraging people to use European services. These trends create new opportunities for alternative hosting providers, new and old, especially in Europe.
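In practice, the multi-cloud risk-spreading described above often reduces to "try one provider, fail over to the next". A minimal sketch in Python using only the standard library; the mirror URLs are placeholders, not real services, and `fetch` stands in for whatever client actually talks to each provider:

```python
def fetch_with_failover(fetch, urls):
    """Try each provider's URL in order; an outage at one is absorbed by the next.

    `fetch` is any callable that takes a URL and returns the response body,
    e.g. a thin wrapper around urllib or a provider SDK.
    """
    errors = []
    for url in urls:
        try:
            return fetch(url)
        except Exception as exc:  # provider outage, timeout, DNS failure...
            errors.append((url, exc))
    raise RuntimeError(f"all providers unavailable: {errors}")

# Hypothetical mirrors of the same service on two different clouds.
MIRRORS = [
    "https://api.eu-cloud.example/status",
    "https://api.us-cloud.example/status",
]

# Simulate an outage at the first provider:
def fake_fetch(url):
    if "eu-cloud" in url:
        raise ConnectionError("provider outage")
    return "ok"

print(fetch_with_failover(fake_fetch, MIRRORS))
```

The catch the article points out applies here too: you now pay two providers and maintain two deployments to survive one outage.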
I spoke to three of the largest hosting providers in Europe to find out whether they are noticing the same trends and what they think the next 20 years of web hosting might look like. Two of them — France's OVH (host of around 4% of websites) and Germany's Hetzner (around 5.5%) — have existed since the late 1990s, before the web 2.0 revolution and before "cloud" was a term. The
