• كيما تقولوا "الذكاء الاصطناعي موجود في كل بلاصة!"، اليوم حبيت نحكي معاكم على Boti، المساعد الذكي لي راح يبدل طريقة وصول المواطنين في بوينس آيرس للمعلومات الحكومية.

    هذا المساعد تم تطويره من طرف حكومة المدينة و GenAIIC، والهدف هو يسهل علينا الاستفسار عن الإجراءات الحكومية بطريقة سريعة وآمنة. عنده نظام حراسة يدخل في العملية باش يحميه من أي أسئلة ضارة، وفوق هذا كاين وكيل خاص يجمع المعلومات الضرورية ويرد علينا بأجوبة واضحة.

    بالصراحة، التكنولوجيا كيفاش قادرة تبدل حياتنا اليومية، يشجعني نفكر في كيفاش نقدر نستعملها في خدماتنا اليومية. لو كان عندنا مساعدة كيما Boti في الجزائر، راني نتصور وش رح تكون الفائدة.

    أشوفو التفاصيل هنا:
    https://aws.amazon.com/blogs/machine-learning/meet-boti-the-ai-assistant-transforming-how-the-citizens-of-buenos-aires-access-government-information-with-amazon-bedrock/

    #الذكاء_الاصطناعي #بو
    🌟 كيما تقولوا "الذكاء الاصطناعي موجود في كل بلاصة!"، اليوم حبيت نحكي معاكم على Boti، المساعد الذكي لي راح يبدل طريقة وصول المواطنين في بوينس آيرس للمعلومات الحكومية. 🎉 هذا المساعد تم تطويره من طرف حكومة المدينة و GenAIIC، والهدف هو يسهل علينا الاستفسار عن الإجراءات الحكومية بطريقة سريعة وآمنة. عنده نظام حراسة يدخل في العملية باش يحميه من أي أسئلة ضارة، وفوق هذا كاين وكيل خاص يجمع المعلومات الضرورية ويرد علينا بأجوبة واضحة. بالصراحة، التكنولوجيا كيفاش قادرة تبدل حياتنا اليومية، يشجعني نفكر في كيفاش نقدر نستعملها في خدماتنا اليومية. لو كان عندنا مساعدة كيما Boti في الجزائر، راني نتصور وش رح تكون الفائدة. أشوفو التفاصيل هنا: https://aws.amazon.com/blogs/machine-learning/meet-boti-the-ai-assistant-transforming-how-the-citizens-of-buenos-aires-access-government-information-with-amazon-bedrock/ #الذكاء_الاصطناعي #بو
    aws.amazon.com
    This post describes the agentic AI assistant built by the Government of the City of Buenos Aires and the GenAIIC to respond to citizens’ questions about government procedures. The solution consists of two primary components: an input guardrail system
    Like
    Love
    Wow
    Sad
    Angry
    893
    · 1 Commentaires ·0 Parts
  • واش راكم خاوتي؟ شفتوا الجديد في عالم التكنولوجيا؟

    الخبر زعمة مهم: الفريق لي ورا Alex، اللي صنعوا أداة معروفة تساعد المبرمجين يستخدموا AI في Xcode، رح ينضموا لـ OpenAI. يعني، الآن OpenAI راهم على وشك يحققوا نقلة نوعية جديدة في عالم البرمجة باعتمادهم على خبرة هاد الفريق.

    بالنسبة لي، هذا يفتح آفاق جديدة لكل المبرمجين، كيفاش يمكن للذكاء الاصطناعي يسهل علينا الحياة ويعطينا أفكار جديدة. كاين بزاف الحوايج لي ما زلنا نكتشفوها في هذا المجال، وانا شخصياً متشوق نشوف وين راح يديونا.

    خلينا نبقوا متابعين، الأمور تتحرك بسرعة في عالم التقنية.

    https://techcrunch.com/2025/09/05/openai-hires-the-team-behind-xcode-coding-assistant-alex-codes/

    #OpenAI #Alex #Xcode #ذكاء_اصطناعي #برمجة
    واش راكم خاوتي؟ شفتوا الجديد في عالم التكنولوجيا؟ الخبر زعمة مهم: الفريق لي ورا Alex، اللي صنعوا أداة معروفة تساعد المبرمجين يستخدموا AI في Xcode، رح ينضموا لـ OpenAI. يعني، الآن OpenAI راهم على وشك يحققوا نقلة نوعية جديدة في عالم البرمجة باعتمادهم على خبرة هاد الفريق. بالنسبة لي، هذا يفتح آفاق جديدة لكل المبرمجين، كيفاش يمكن للذكاء الاصطناعي يسهل علينا الحياة ويعطينا أفكار جديدة. كاين بزاف الحوايج لي ما زلنا نكتشفوها في هذا المجال، وانا شخصياً متشوق نشوف وين راح يديونا. خلينا نبقوا متابعين، الأمور تتحرك بسرعة في عالم التقنية. https://techcrunch.com/2025/09/05/openai-hires-the-team-behind-xcode-coding-assistant-alex-codes/ #OpenAI #Alex #Xcode #ذكاء_اصطناعي #برمجة
    techcrunch.com
    The team behind Alex, which built a popular tool that let devs use AI models within Apple's development suite Xcode, is joining OpenAI.
    Like
    Love
    Wow
    Sad
    Angry
    1KB
    · 1 Commentaires ·0 Parts
  • شفتوا هاد الشي؟ عاودت نشوف مقال عن كيفية استعمال الذكاء الاصطناعي في الجرائم الإلكترونية، والمفاجأة كانت كبيرة! في الأخير، واحد من المجرمين استعمل Claude Code باش يسرق بيانات شخصية من 17 منظمة مختلفة، من الصحة للخدمات الطارئة وحتى المؤسسات الحكومية. بدل ما يستعملوا الفدية التقليدية، هاد المجرم كان يهدد الناس بنشر بياناتهم، والمطالب كانت توصل لأكثر من 500,000 دولار!

    شي عجيب كيف الذكاء الاصطناعي ولى يستعملوه في عمليات معقدة، حتى في اتخاذ القرارات. هاد الشي يخليني نفكر في المخاطر اللي يجيبوها التكنولوجيا الحديثة، وواش لازم نكونوا حذرين أكثر؟

    المقال هذا يستاهل القراية، لأنه ينورنا على جانب مظلم من الابتكارات.

    https://www.schneier.com/blog/archives/2025/09/generative-ai-as-a-cybercrime-assistant.html

    #جرائم_الكترونية #AI #ذكاء_اصطناعي #Cybercrime #Claude
    شفتوا هاد الشي؟ 🤯 عاودت نشوف مقال عن كيفية استعمال الذكاء الاصطناعي في الجرائم الإلكترونية، والمفاجأة كانت كبيرة! في الأخير، واحد من المجرمين استعمل Claude Code باش يسرق بيانات شخصية من 17 منظمة مختلفة، من الصحة للخدمات الطارئة وحتى المؤسسات الحكومية. بدل ما يستعملوا الفدية التقليدية، هاد المجرم كان يهدد الناس بنشر بياناتهم، والمطالب كانت توصل لأكثر من 500,000 دولار! 😱 شي عجيب كيف الذكاء الاصطناعي ولى يستعملوه في عمليات معقدة، حتى في اتخاذ القرارات. هاد الشي يخليني نفكر في المخاطر اللي يجيبوها التكنولوجيا الحديثة، وواش لازم نكونوا حذرين أكثر؟ المقال هذا يستاهل القراية، لأنه ينورنا على جانب مظلم من الابتكارات. https://www.schneier.com/blog/archives/2025/09/generative-ai-as-a-cybercrime-assistant.html #جرائم_الكترونية #AI #ذكاء_اصطناعي #Cybercrime #Claude
    www.schneier.com
    Anthropic reports on a Claude user: We recently disrupted a sophisticated cybercriminal that used Claude Code to commit large-scale theft and extortion of personal data. The actor targeted at least 17 distinct organizations, including in healthcare,
    Like
    Love
    Wow
    Sad
    Angry
    678
    · 1 Commentaires ·0 Parts
  • في عصر الذكاء الاصطناعي، كاينين تهديدات جديدة تخضّنا! خاصّة بعد ما شفتوا كيفاش تكنولوجيا LLMs دخلت في حياتنا اليومية، وولاّت تتسخدم في التطبيقات اللي نعتمدوا عليها.

    المقال هذا يستكشف موضوع مثير: "Indirect Prompt Injection Attacks Against LLM Assistants". ببساطة، هذ المقال كيتحدث عن كيفاش ممكن يتم استغلال الذكاء الاصطناعي بطرق خبيثة، وبالخصوص من خلال ما يسمى بـ "Promptware". هذي هجمات غريبة تقدر تسبب مشاكل كبيرة في الأمان، كيما التسريبات أو حتى التحكم في الأجهزة المنزلية.

    شخصيًا، حسيت بخطورة الأمور هذي لما استعملت Google Assistant وبدأت نفكر في كيفاه ممكن يكون عنده تأثيرات سلبية. لازم نكونوا واعيين!

    أدعوكم تقراو المقال وتفكروا في الأمان الرقمي في حياتكم اليومية.

    https://www.schneier.com/blog/archives/2025/09/indirect-prompt-injection-attacks-against-llm-ass
    🔒 في عصر الذكاء الاصطناعي، كاينين تهديدات جديدة تخضّنا! خاصّة بعد ما شفتوا كيفاش تكنولوجيا LLMs دخلت في حياتنا اليومية، وولاّت تتسخدم في التطبيقات اللي نعتمدوا عليها. المقال هذا يستكشف موضوع مثير: "Indirect Prompt Injection Attacks Against LLM Assistants". ببساطة، هذ المقال كيتحدث عن كيفاش ممكن يتم استغلال الذكاء الاصطناعي بطرق خبيثة، وبالخصوص من خلال ما يسمى بـ "Promptware". هذي هجمات غريبة تقدر تسبب مشاكل كبيرة في الأمان، كيما التسريبات أو حتى التحكم في الأجهزة المنزلية. شخصيًا، حسيت بخطورة الأمور هذي لما استعملت Google Assistant وبدأت نفكر في كيفاه ممكن يكون عنده تأثيرات سلبية. لازم نكونوا واعيين! أدعوكم تقراو المقال وتفكروا في الأمان الرقمي في حياتكم اليومية. https://www.schneier.com/blog/archives/2025/09/indirect-prompt-injection-attacks-against-llm-ass
    www.schneier.com
    Really good research on practical attacks against LLM agents. “Invitation Is All You Need! Promptware Attacks Against LLM-Powered Assistants in Production Are Practical and Dangerous” Abstract: The growing integration of LLMs into applica
    Like
    Love
    Wow
    Angry
    Sad
    678
    · 1 Commentaires ·0 Parts
  • يا جماعة، واش راكم؟

    خبر زين جابوه لنا من WhatsApp! دابا، كاين "AI-powered writing assistant" اللي يقدر يساعدكم في كتابة الرسائل، سواء كانت رسمية ولا حتى ضحك. يعني لو كنتو محتاجين نصيحة ولا فكرة لرسالة، تقدروا تستعملوا الأيقونة الجديدة "pencil" في الدردشة.

    شوفوا، رغم أنو الجديد هذا مازال يخرج في أمريكا وبالإنجليزية غير، إلا أنو وعدو يجيو لبلدان ولغات أخرى قريب. بالنسبة لي، هو مفيد في النصوص الطويلة، لكن في المحادثات السريعة، واش رأيكم؟

    المهم، عجبني بلي WhatsApp حافظو على الخصوصية، يعني ما كاينش حد يقرأ رسائلكم.

    تابعوا الجديد وخلينا نفكروا كيفاش نقدروا نستعملوه في حياتنا اليومية.

    https://www.engadget.com/ai/whatsapp-is-the-latest-to-offer-an-ai-powered-writing-assistant-182116369.html?src=rss

    #Whats
    يا جماعة، واش راكم؟ 🖐️ خبر زين جابوه لنا من WhatsApp! دابا، كاين "AI-powered writing assistant" اللي يقدر يساعدكم في كتابة الرسائل، سواء كانت رسمية ولا حتى ضحك. 🎉 يعني لو كنتو محتاجين نصيحة ولا فكرة لرسالة، تقدروا تستعملوا الأيقونة الجديدة "pencil" في الدردشة. شوفوا، رغم أنو الجديد هذا مازال يخرج في أمريكا وبالإنجليزية غير، إلا أنو وعدو يجيو لبلدان ولغات أخرى قريب. بالنسبة لي، هو مفيد في النصوص الطويلة، لكن في المحادثات السريعة، واش رأيكم؟ المهم، عجبني بلي WhatsApp حافظو على الخصوصية، يعني ما كاينش حد يقرأ رسائلكم. 😌 تابعوا الجديد وخلينا نفكروا كيفاش نقدروا نستعملوه في حياتنا اليومية. https://www.engadget.com/ai/whatsapp-is-the-latest-to-offer-an-ai-powered-writing-assistant-182116369.html?src=rss #Whats
    www.engadget.com
    WhatsApp just introduced an AI-powered writing assistant, in case you need help with a text or whatever. The AI provides suggestions in various styles, like professional, funny or supportive. Once generated, the user can continue editing the message
    Like
    Love
    Sad
    Wow
    Angry
    37
    · 1 Commentaires ·0 Parts
  • شحال من مرة حسيت روحك ضايع بين البيانات والمعلومات؟

    مع التطورات الكبيرة في التكنولوجيا، فريق Amazon Finance خدم على حل مبتكر باستخدام الذكاء الاصطناعي مع Amazon Bedrock وAmazon Kendra. الهدف هو مساعدة المحللين في اكتشاف البيانات واستخراج المعطيات اللي يحتاجوها لاتخاذ قرارات مالية أفضل. تخيل معايا كيفاش هالتقنيات تقدر تحدث فرق كبير في الكفاءة التشغيلية وتوحد العمليات عبر العالم!

    شخصياً، نحب كيفاش التكنولوجيا اليوم تسهل علينا الحياة وتخلي الأمور أكثر وضوح. في الوقت اللي كنا نتعبو في تحليل البيانات، الآن عندنا مساعد ذكي يخلصنا من هذه التعب.

    نشجعكم تفكروا في الطرق الجديدة اللي ممكن تستعملوها في عملكم أو دراستكم لتحسين الأداء والاستفادة من التكنولوجيا.

    https://aws.amazon.com/blogs/machine-learning/how-amazon-finance-built-an-ai-assistant-using-amazon-bedrock-and-amazon-kendra-to-support-analysts-for-data-discovery-and-business-insights/
    #تكنولوجيا #ذكاء_اصطناعي #AI #Amazon #تحليل_
    🌟 شحال من مرة حسيت روحك ضايع بين البيانات والمعلومات؟ مع التطورات الكبيرة في التكنولوجيا، فريق Amazon Finance خدم على حل مبتكر باستخدام الذكاء الاصطناعي مع Amazon Bedrock وAmazon Kendra. الهدف هو مساعدة المحللين في اكتشاف البيانات واستخراج المعطيات اللي يحتاجوها لاتخاذ قرارات مالية أفضل. تخيل معايا كيفاش هالتقنيات تقدر تحدث فرق كبير في الكفاءة التشغيلية وتوحد العمليات عبر العالم! شخصياً، نحب كيفاش التكنولوجيا اليوم تسهل علينا الحياة وتخلي الأمور أكثر وضوح. في الوقت اللي كنا نتعبو في تحليل البيانات، الآن عندنا مساعد ذكي يخلصنا من هذه التعب. نشجعكم تفكروا في الطرق الجديدة اللي ممكن تستعملوها في عملكم أو دراستكم لتحسين الأداء والاستفادة من التكنولوجيا. https://aws.amazon.com/blogs/machine-learning/how-amazon-finance-built-an-ai-assistant-using-amazon-bedrock-and-amazon-kendra-to-support-analysts-for-data-discovery-and-business-insights/ #تكنولوجيا #ذكاء_اصطناعي #AI #Amazon #تحليل_
    aws.amazon.com
    The Amazon Finance technical team develops and manages comprehensive technology solutions that power financial decision-making and operational efficiency while standardizing across Amazon’s global operations. In this post, we explain how the team con
    Like
    Love
    Wow
    Sad
    Angry
    267
    · 1 Commentaires ·0 Parts
  • واش رايكم في آخر تحديثات Open AI؟ شفتوا كيفاش التكنولوجيا تتطور بسرعة؟ نسمعوا على GPT 4.5 Turbo و Open AI Assistant API، بصراحة هذي عفسة كبيرة! واش المدهش هو كيفاش راهي تسهل حياتنا اليومية، من المحادثة مع الذكاء الاصطناعي إلى تحسين الإنتاجية.

    لكن هنا نطرح سؤال: هل رايحين نقدروا نثقوا في الذكاء الاصطناعي ونتخلى على بعض الأمور البشرية؟ بعض الناس يخافوا من التأثيرات السلبية، كيما فقدان الوظائف أو غياب الإبداع. شخصيا، نحب التكنولوجيا بس كيما نقول "شحال ما تكون مفيدة، لازم نبقى واعيين".

    ما رأيكم في هذي التطورات؟ واش تخافوا منها ولا تشوفوها فرصة جديدة؟ خلينا نشاركوا الآراء ونشوفوا كيفاش ممكن نستغلوا هذي التحديثات لصالحنا.

    #تكنولوجيا #OpenAI #جزايرنا #ذكاء_اصطناعي #مستقبل
    واش رايكم في آخر تحديثات Open AI؟ شفتوا كيفاش التكنولوجيا تتطور بسرعة؟ نسمعوا على GPT 4.5 Turbo و Open AI Assistant API، بصراحة هذي عفسة كبيرة! واش المدهش هو كيفاش راهي تسهل حياتنا اليومية، من المحادثة مع الذكاء الاصطناعي إلى تحسين الإنتاجية. لكن هنا نطرح سؤال: هل رايحين نقدروا نثقوا في الذكاء الاصطناعي ونتخلى على بعض الأمور البشرية؟ بعض الناس يخافوا من التأثيرات السلبية، كيما فقدان الوظائف أو غياب الإبداع. شخصيا، نحب التكنولوجيا بس كيما نقول "شحال ما تكون مفيدة، لازم نبقى واعيين". ما رأيكم في هذي التطورات؟ واش تخافوا منها ولا تشوفوها فرصة جديدة؟ خلينا نشاركوا الآراء ونشوفوا كيفاش ممكن نستغلوا هذي التحديثات لصالحنا. #تكنولوجيا #OpenAI #جزايرنا #ذكاء_اصطناعي #مستقبل
    Like
    Love
    Wow
    Angry
    Sad
    403
    · 1 Commentaires ·0 Parts
  • يا جماعة، شفتوا الجديد في عالم التكنولوجيا؟

    Microsoft صايرت تتعاون مع Samsung وداخلت Copilot AI في التلفزيونات والمونيتورات تاعهم لعام 2025. يعني تقدروا تطلبوا من Copilot اقتراحات أفلام، ملخصات حلقات من غير حرق، وأي أسئلة عامة. تخيلوا معايا، عندكم مساعد ذكي معاكم في الصالون، ويجيكم بشخصية مرحة وملونة!

    شخصيا، عندي فضول كبير نشوف كيفاش هاد التكنولوجيا رح تغير طريقة شويتنا مع التلفزيون. في الوقت اللي كنا نبحثوا فيه على الشيء، رح يكون عندنا ذاك الصديق الذكي يسهل علينا الأمور.

    فكروا في حياتكم اليومية كيفاش رح يكون عندكم Copilot معاكم، يدخل الفرح والذكاء في المشاهدة!

    https://www.theverge.com/news/767078/microsoft-samsung-tv-copilot-ai-assistant-launch

    #تكنولوجيا #Samsung #Microsoft #AI #التسلية
    يا جماعة، شفتوا الجديد في عالم التكنولوجيا؟ Microsoft صايرت تتعاون مع Samsung وداخلت Copilot AI في التلفزيونات والمونيتورات تاعهم لعام 2025. يعني تقدروا تطلبوا من Copilot اقتراحات أفلام، ملخصات حلقات من غير حرق، وأي أسئلة عامة. تخيلوا معايا، عندكم مساعد ذكي معاكم في الصالون، ويجيكم بشخصية مرحة وملونة! شخصيا، عندي فضول كبير نشوف كيفاش هاد التكنولوجيا رح تغير طريقة شويتنا مع التلفزيون. في الوقت اللي كنا نبحثوا فيه على الشيء، رح يكون عندنا ذاك الصديق الذكي يسهل علينا الأمور. فكروا في حياتكم اليومية كيفاش رح يكون عندكم Copilot معاكم، يدخل الفرح والذكاء في المشاهدة! https://www.theverge.com/news/767078/microsoft-samsung-tv-copilot-ai-assistant-launch #تكنولوجيا #Samsung #Microsoft #AI #التسلية
    www.theverge.com
    Microsoft’s Copilot AI assistant is officially coming to TVs, starting with Samsung’s 2025 lineup of TVs and smart monitors. With the integration, you can call upon Copilot and ask for movie suggestions, spoiler-free episode recaps, and other general
    Like
    Love
    Wow
    Sad
    Angry
    686
    · 1 Commentaires ·0 Parts
  • Elon Musk porte plainte contre Apple et OpenAI qu’il accuse de pratiques anticoncurrentielles

    Elon Musk à Grünheide, en Allemagne, le 22 mars 2022. PATRICK PLEUL/VIA REUTERS Le réseau social X et la start-up xAI, propriétés d’Elon Musk, ont porté plainte, lundi 25 août, contre Apple et OpenAI qu’ils accusent d’avoir formé une alliance illégale pour entraver la concurrence dans le domaine de l’intelligence artificiellegénérative sur les smartphones. Le milliardaire, qui a saisi une cour fédérale du Texas, affirme que le fabricant de l’iPhone et l’éditeur de ChatGPT se sont mis d’accord pour intégrer l’assistant IA aux smartphones d’Apple, tout en écartant des rivaux, comme Grok, l’assistant IA de xAI. « C’est l’histoire de deux monopoles qui unissent leurs forces pour assurer leur domination continue dans un monde désormais propulsé par la technologie la plus puissante jamais créée par l’humanité : l’intelligence artificielle », peut-on lire dans la plainte. X et xAI affirment qu’Apple détient 65 % du marché des smartphones aux Etats-Unis, tandis qu’OpenAI contrôlerait au moins 80 % du marché des assistants d’IA générative, grâce à ChatGPT. Apple et OpenAI ont annoncé leur partenariat en juin 2024, intégrant le célèbre assistant à certaines fonctionnalités de l’iPhone, notamment son assistant vocal Siri. Selon la plainte, cet accord donnerait à ChatGPT un accès exclusif à « des milliards de requêtes d’utilisateurs » provenant de centaines de millions d’appareils. Elon Musk accuse également Apple de manipuler le classement de sa plateforme de téléchargement des applications mobilespour favoriser l’application ChatGPT, tout en retardant l’approbation des mises à jour de l’application Grok. Ses entreprises réclament plusieurs milliards de dollars de dommages et intérêts ainsi qu’une injonction permanente pour mettre fin aux pratiques anticoncurrentielles qu’elles dénoncent. Lire aussi | Article réservé à nos abonnés Intelligence artificielle : les échanges aigres-doux entre Sam Altman et Mark Zuckerberg, nouveaux rivaux de la tech « Campagne acharnée » « Cette nouvelle plainte correspond bien au comportement récurrent de M. Musk, caractérisé par le harcèlement », a réagi un porte-parole d’OpenAI. Apple n’a pas répondu à une sollicitation de l’Agence France-Presse. Elon Musk a fait partie de l’équipe de onze personnes qui a fondé OpenAI en 2015, mais il a quitté l’entreprise en 2018 et ne cesse de l’attaquer sur X et devant les tribunaux depuis le succès phénoménal de ChatGPT à la fin de 2022. Il a entrepris une action en justice contre la star de la Silicon Valley qui aurait, selon lui, trahi ses valeurs, mais a aussi proposé de la racheter. OpenAI a riposté en avril avec une plainte contre le milliardaire, l’accusant de mener une « campagne acharnée » pour lui nuire. Newsletter Newsletter Le Monde Newsletter Suivez-nous sur WhatsApp Ce mois-ci, Elon Musk s’en est pris à Apple : « Apple agit de manière à rendre impossible pour toute entreprise d’IA autre qu’OpenAI d’atteindre la première place sur l’App Store, ce qui constitue une violation manifeste des règles de concurrence », a lancé le milliardaire sur X. Ces accusations ont provoqué une passe d’armes avec Sam Altman, cofondateur et patron d’OpenAI. « C’est une affirmation remarquable, compte tenu de ce que j’ai entendu dire à propos d’Elon, qui manipulerait X pour son propre bénéfice et celui de ses entreprises et pour nuire à ses concurrents ainsi qu’aux personnes qu’il n’apprécie pas », a-t-il réagi sur X. Sam Altman « ment comme il respire », a renchéri Elon Musk, qualifiant son message de « connerie ». Lire aussi | Article réservé à nos abonnés De Mark Zuckerberg à Elon Musk, le « boys club » de Palo Alto Le Monde avec AFP Réutiliser ce contenu
    #elon #musk #porte #plainte #contre
    Elon Musk porte plainte contre Apple et OpenAI qu’il accuse de pratiques anticoncurrentielles
    Elon Musk à Grünheide, en Allemagne, le 22 mars 2022. PATRICK PLEUL/VIA REUTERS Le réseau social X et la start-up xAI, propriétés d’Elon Musk, ont porté plainte, lundi 25 août, contre Apple et OpenAI qu’ils accusent d’avoir formé une alliance illégale pour entraver la concurrence dans le domaine de l’intelligence artificiellegénérative sur les smartphones. Le milliardaire, qui a saisi une cour fédérale du Texas, affirme que le fabricant de l’iPhone et l’éditeur de ChatGPT se sont mis d’accord pour intégrer l’assistant IA aux smartphones d’Apple, tout en écartant des rivaux, comme Grok, l’assistant IA de xAI. « C’est l’histoire de deux monopoles qui unissent leurs forces pour assurer leur domination continue dans un monde désormais propulsé par la technologie la plus puissante jamais créée par l’humanité : l’intelligence artificielle », peut-on lire dans la plainte. X et xAI affirment qu’Apple détient 65 % du marché des smartphones aux Etats-Unis, tandis qu’OpenAI contrôlerait au moins 80 % du marché des assistants d’IA générative, grâce à ChatGPT. Apple et OpenAI ont annoncé leur partenariat en juin 2024, intégrant le célèbre assistant à certaines fonctionnalités de l’iPhone, notamment son assistant vocal Siri. Selon la plainte, cet accord donnerait à ChatGPT un accès exclusif à « des milliards de requêtes d’utilisateurs » provenant de centaines de millions d’appareils. Elon Musk accuse également Apple de manipuler le classement de sa plateforme de téléchargement des applications mobilespour favoriser l’application ChatGPT, tout en retardant l’approbation des mises à jour de l’application Grok. Ses entreprises réclament plusieurs milliards de dollars de dommages et intérêts ainsi qu’une injonction permanente pour mettre fin aux pratiques anticoncurrentielles qu’elles dénoncent. Lire aussi | Article réservé à nos abonnés Intelligence artificielle : les échanges aigres-doux entre Sam Altman et Mark Zuckerberg, nouveaux rivaux de la tech « Campagne acharnée » « Cette nouvelle plainte correspond bien au comportement récurrent de M. Musk, caractérisé par le harcèlement », a réagi un porte-parole d’OpenAI. Apple n’a pas répondu à une sollicitation de l’Agence France-Presse. Elon Musk a fait partie de l’équipe de onze personnes qui a fondé OpenAI en 2015, mais il a quitté l’entreprise en 2018 et ne cesse de l’attaquer sur X et devant les tribunaux depuis le succès phénoménal de ChatGPT à la fin de 2022. Il a entrepris une action en justice contre la star de la Silicon Valley qui aurait, selon lui, trahi ses valeurs, mais a aussi proposé de la racheter. OpenAI a riposté en avril avec une plainte contre le milliardaire, l’accusant de mener une « campagne acharnée » pour lui nuire. Newsletter Newsletter Le Monde Newsletter Suivez-nous sur WhatsApp Ce mois-ci, Elon Musk s’en est pris à Apple : « Apple agit de manière à rendre impossible pour toute entreprise d’IA autre qu’OpenAI d’atteindre la première place sur l’App Store, ce qui constitue une violation manifeste des règles de concurrence », a lancé le milliardaire sur X. Ces accusations ont provoqué une passe d’armes avec Sam Altman, cofondateur et patron d’OpenAI. « C’est une affirmation remarquable, compte tenu de ce que j’ai entendu dire à propos d’Elon, qui manipulerait X pour son propre bénéfice et celui de ses entreprises et pour nuire à ses concurrents ainsi qu’aux personnes qu’il n’apprécie pas », a-t-il réagi sur X. Sam Altman « ment comme il respire », a renchéri Elon Musk, qualifiant son message de « connerie ». Lire aussi | Article réservé à nos abonnés De Mark Zuckerberg à Elon Musk, le « boys club » de Palo Alto Le Monde avec AFP Réutiliser ce contenu #elon #musk #porte #plainte #contre
    Elon Musk porte plainte contre Apple et OpenAI qu’il accuse de pratiques anticoncurrentielles
    www.lemonde.fr
    Elon Musk à Grünheide, en Allemagne, le 22 mars 2022. PATRICK PLEUL/VIA REUTERS Le réseau social X et la start-up xAI, propriétés d’Elon Musk, ont porté plainte, lundi 25 août, contre Apple et OpenAI qu’ils accusent d’avoir formé une alliance illégale pour entraver la concurrence dans le domaine de l’intelligence artificielle (IA) générative sur les smartphones. Le milliardaire, qui a saisi une cour fédérale du Texas, affirme que le fabricant de l’iPhone et l’éditeur de ChatGPT se sont mis d’accord pour intégrer l’assistant IA aux smartphones d’Apple, tout en écartant des rivaux, comme Grok, l’assistant IA de xAI. « C’est l’histoire de deux monopoles qui unissent leurs forces pour assurer leur domination continue dans un monde désormais propulsé par la technologie la plus puissante jamais créée par l’humanité : l’intelligence artificielle », peut-on lire dans la plainte. X et xAI affirment qu’Apple détient 65 % du marché des smartphones aux Etats-Unis, tandis qu’OpenAI contrôlerait au moins 80 % du marché des assistants d’IA générative, grâce à ChatGPT. Apple et OpenAI ont annoncé leur partenariat en juin 2024, intégrant le célèbre assistant à certaines fonctionnalités de l’iPhone, notamment son assistant vocal Siri. Selon la plainte, cet accord donnerait à ChatGPT un accès exclusif à « des milliards de requêtes d’utilisateurs » provenant de centaines de millions d’appareils. Elon Musk accuse également Apple de manipuler le classement de sa plateforme de téléchargement des applications mobiles (App Store) pour favoriser l’application ChatGPT, tout en retardant l’approbation des mises à jour de l’application Grok. Ses entreprises réclament plusieurs milliards de dollars de dommages et intérêts ainsi qu’une injonction permanente pour mettre fin aux pratiques anticoncurrentielles qu’elles dénoncent. Lire aussi | Article réservé à nos abonnés Intelligence artificielle : les échanges aigres-doux entre Sam Altman et Mark Zuckerberg, nouveaux rivaux de la tech « Campagne acharnée » « Cette nouvelle plainte correspond bien au comportement récurrent de M. Musk, caractérisé par le harcèlement », a réagi un porte-parole d’OpenAI. Apple n’a pas répondu à une sollicitation de l’Agence France-Presse. Elon Musk a fait partie de l’équipe de onze personnes qui a fondé OpenAI en 2015, mais il a quitté l’entreprise en 2018 et ne cesse de l’attaquer sur X et devant les tribunaux depuis le succès phénoménal de ChatGPT à la fin de 2022. Il a entrepris une action en justice contre la star de la Silicon Valley qui aurait, selon lui, trahi ses valeurs, mais a aussi proposé de la racheter. OpenAI a riposté en avril avec une plainte contre le milliardaire, l’accusant de mener une « campagne acharnée » pour lui nuire. Newsletter Newsletter Le Monde Newsletter Suivez-nous sur WhatsApp Ce mois-ci, Elon Musk s’en est pris à Apple : « Apple agit de manière à rendre impossible pour toute entreprise d’IA autre qu’OpenAI d’atteindre la première place sur l’App Store, ce qui constitue une violation manifeste des règles de concurrence », a lancé le milliardaire sur X. Ces accusations ont provoqué une passe d’armes avec Sam Altman, cofondateur et patron d’OpenAI. « C’est une affirmation remarquable, compte tenu de ce que j’ai entendu dire à propos d’Elon, qui manipulerait X pour son propre bénéfice et celui de ses entreprises et pour nuire à ses concurrents ainsi qu’aux personnes qu’il n’apprécie pas », a-t-il réagi sur X. Sam Altman « ment comme il respire », a renchéri Elon Musk, qualifiant son message de « connerie ». Lire aussi | Article réservé à nos abonnés De Mark Zuckerberg à Elon Musk, le « boys club » de Palo Alto Le Monde avec AFP Réutiliser ce contenu
    Like
    Love
    Wow
    Angry
    Sad
    645
    · 2 Commentaires ·0 Parts
  • NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI

    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry.
    Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device.
    This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics.

    Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments.
    “The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills,” said Peggy Johnson, CEO of Agility Robotics. “With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers’ warehouses and factories.”
    Boston Dynamics — which has been building some of the industry’s most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device.
    Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models.
    A Giant Leap for Real-Time Robot Reasoning
    Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency.
    Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.
    NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization.
    With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.
    Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing.
    With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.
    Jetson Thor Set to Advance Research Innovation 
    Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications.
    At Carnegie Mellon’s Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue.
    “We can only do as much as the compute available allows,” said Sebastian Scherer, an associate research professor at the university and head of the AirLab. “Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks.”
    Scherer anticipates that by upgrading from his team’s existing NVIDIA Jetson AGX Orin systems to Jetson AGX Thor developer kit, they’ll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets.
    Wield the Strength of Jetson Thor
    The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply.
    NVIDIA Jetson AGX Thor Developer Kit
    The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors.
    Sensor and Actuator companies including Analog Devices, Inc., e-con Systems,  Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency.
    Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio.
    More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough.

    To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face.
    The NVIDIA Jetson AGX Thor developer kit is available now starting at NVIDIA Jetson T5000 modules are available starting at for 1,000 units. Buy now from authorized NVIDIA partners.
    NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
    #nvidia #jetson #thor #unlocks #realtime
    NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI
    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry. Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device. This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics. Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments. “The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills,” said Peggy Johnson, CEO of Agility Robotics. “With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers’ warehouses and factories.” Boston Dynamics — which has been building some of the industry’s most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device. Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models. A Giant Leap for Real-Time Robot Reasoning Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency. Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally. NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization. With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases. Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing. With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams. Jetson Thor Set to Advance Research Innovation  Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications. At Carnegie Mellon’s Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue. “We can only do as much as the compute available allows,” said Sebastian Scherer, an associate research professor at the university and head of the AirLab. “Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks.” Scherer anticipates that by upgrading from his team’s existing NVIDIA Jetson AGX Orin systems to Jetson AGX Thor developer kit, they’ll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets. Wield the Strength of Jetson Thor The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply. NVIDIA Jetson AGX Thor Developer Kit The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors. Sensor and Actuator companies including Analog Devices, Inc., e-con Systems,  Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency. Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio. More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough. To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face. The NVIDIA Jetson AGX Thor developer kit is available now starting at NVIDIA Jetson T5000 modules are available starting at for 1,000 units. Buy now from authorized NVIDIA partners. NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September. #nvidia #jetson #thor #unlocks #realtime
    NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI
    blogs.nvidia.com
    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry. Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device. This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics. Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments. “The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills,” said Peggy Johnson, CEO of Agility Robotics. “With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers’ warehouses and factories.” Boston Dynamics — which has been building some of the industry’s most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device. Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models. A Giant Leap for Real-Time Robot Reasoning Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency. Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally. NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization. With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases. Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing. With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams. Jetson Thor Set to Advance Research Innovation  Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications. At Carnegie Mellon’s Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue. “We can only do as much as the compute available allows,” said Sebastian Scherer, an associate research professor at the university and head of the AirLab. “Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks.” Scherer anticipates that by upgrading from his team’s existing NVIDIA Jetson AGX Orin systems to Jetson AGX Thor developer kit, they’ll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets. Wield the Strength of Jetson Thor The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply. NVIDIA Jetson AGX Thor Developer Kit The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors. Sensor and Actuator companies including Analog Devices, Inc. (ADI), e-con Systems,  Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency. Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio. More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough. To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face. The NVIDIA Jetson AGX Thor developer kit is available now starting at $3,499. NVIDIA Jetson T5000 modules are available starting at $2,999 for 1,000 units. Buy now from authorized NVIDIA partners. NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
    Like
    Love
    Wow
    Sad
    Angry
    797
    · 2 Commentaires ·0 Parts
  • Metal Gear Solid Delta: Snake Eater hands-on report

    It’s been over two decades since Metal Gear Solid 3: Snake Eater was first released on PlayStation 2. The game was praised for its story, characters, and possibly one of the greatest themes in video game history. After some brumation, it sheds its skin and emerges as Metal Gear Solid Delta: Snake Eater on August 28, aiming to recapture the spirit that made the original a beloved classic. After about eight hours of playing the game on PS5 Pro, I’m thrilled to share how it captures and modernizes the original’s spirit, and then some.

    Play Video

    Delta is a true from-the-ground-up remake that is extremely faithful to the original work in most aspects of the game, but what was immediately apparent was the level of detail the updated visuals and textures add to the experience. 

    A new level of visual fidelity

    View and download image

    Download the image

    close
    Close

    Download this image

    View and download image

    Download the image

    close
    Close

    Download this image

    View and download image

    Download the image

    close
    Close

    Download this image

    View and download image

    Download the image

    close
    Close

    Download this image

    View and download image

    Download the image

    close
    Close

    Download this image

    This updated version of Snake Eater is a visual feast on PS5 Pro, especially in the lush details. For example, rain droplets trickle realistically down a poncho, and Snake’s camouflage and uniforms become dirty with mud or forest debris. This filth even carries over into cutscenes, adding an appreciated level of realism.

    The Metal Gear series showcases a range of grizzled warriors, many with scars that tell a tale. If you’re familiar with Snake Eater, you understand that scars hold a lot of importance throughout, and the devs took great care to make them stand out. One of the most notable examples is Colonel Volgin’s harrowingly scarred face. The believable tissue and its deformation when he speaks create a tragically beautiful portrait. 

    Speaking of portraits, a new photo mode has been added with all the latest bells and whistles. Like most Metal Gear games, Delta definitely has its fair share of silly moments, and you can capture them all. With plenty of filters and settings, create a masterpiece on the mountainside, or dress up in a crocodile head and let antics ensue. Photo Mode is the perfect way to capture all the little details hiding within.

    Game controls – New Style vs. Legacy

    A new control scheme has been introduced to bring Snake Eater to the modern gaming era, dubbed New Style. Before starting a new game, players can choose between the New Style and Legacy, which retains the controls mapped after the original PS2 release. You can switch between styles, but be warned, this will reload the level/map and take you back to the beginning of the section.

    New Style is geared for people who have never played the game before, or who might prefer a more modern playstyle. The control option provides a free-moving camera that lets you view your environment in 360 degrees, making it easier to avoid getting lost or having enemies catch you unprepared.

    Combat and shooting feel reminiscent of Metal Gear Solid V, with a third-person over-the-shoulder camera. By default, aim assist is turned on, but can be toggled off. Even in New Style, you can still switch to a classic first-person view and still fully move around as if playing a FPS title. First-person view is especially valuable when lining up the perfect shot through a chainlink fence, which I couldn’t pull off in third-person.

    The biggest saving grace in the updated control scheme is the remapped directional buttons. Holding left brings up your non-combat inventory, and holding right brings up your currently equipped weapons. Up brings up the quick-change camouflage menu, while down brings up your radio —a hugely welcome shortcut. No more digging through menus to change outfits based on your environment.

    Snake sneaks through a range of environments in Snake Eater, each suited to different camouflage options The quick change menu conveniently shows the optimal face and body combo from your collection based on the current environmentIn one instance, I managed to seamlessly transition from a green texture to a stone grey-black getup, then to a rust-colored camouflage, all along the same crawl route. This new quality-of-life option keeps the action flowing.

    Another great accessibility feature is the ability to fine-tune game hints. From always-on to none at all. I had it set to show helpful hints when they were relevant, like swimming controls appearing by a body of water and hanging controls on the cliffside. This is particularly helpful in rare gameplay situations, as it kept me from panicking in high-stress situations. 

    What a thrill

    The voice cast still delivers, and The Cobra Unit is just as compelling, with big moments still having the right impact. The ladder scene took me right back to playing the original on my grandmother’s floor all those years ago. 

    Paradoxes, easter eggs, and all the details I’d expect are still in place. I didn’t encounter any moments that felt off or deviated too far in any way from the script. The opening theme and intro movie have been remixed, and while it will come down to personal taste, every note still hits for me. 

    Metal Gear Solid Delta: Snake Eater launches on August 28 for PS5, and is a day to mark on your calendar whether you’re a longtime fan or series newcomer interested in discovering the celebrated origins of the storyline.  

    Metal Gear Solid Delta: Snake Eater developers discuss the game in length in a new interview.
    #metal #gear #solid #delta #snake
    Metal Gear Solid Delta: Snake Eater hands-on report
    It’s been over two decades since Metal Gear Solid 3: Snake Eater was first released on PlayStation 2. The game was praised for its story, characters, and possibly one of the greatest themes in video game history. After some brumation, it sheds its skin and emerges as Metal Gear Solid Delta: Snake Eater on August 28, aiming to recapture the spirit that made the original a beloved classic. After about eight hours of playing the game on PS5 Pro, I’m thrilled to share how it captures and modernizes the original’s spirit, and then some. Play Video Delta is a true from-the-ground-up remake that is extremely faithful to the original work in most aspects of the game, but what was immediately apparent was the level of detail the updated visuals and textures add to the experience.  A new level of visual fidelity View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image This updated version of Snake Eater is a visual feast on PS5 Pro, especially in the lush details. For example, rain droplets trickle realistically down a poncho, and Snake’s camouflage and uniforms become dirty with mud or forest debris. This filth even carries over into cutscenes, adding an appreciated level of realism. The Metal Gear series showcases a range of grizzled warriors, many with scars that tell a tale. If you’re familiar with Snake Eater, you understand that scars hold a lot of importance throughout, and the devs took great care to make them stand out. One of the most notable examples is Colonel Volgin’s harrowingly scarred face. The believable tissue and its deformation when he speaks create a tragically beautiful portrait.  Speaking of portraits, a new photo mode has been added with all the latest bells and whistles. Like most Metal Gear games, Delta definitely has its fair share of silly moments, and you can capture them all. With plenty of filters and settings, create a masterpiece on the mountainside, or dress up in a crocodile head and let antics ensue. Photo Mode is the perfect way to capture all the little details hiding within. Game controls – New Style vs. Legacy A new control scheme has been introduced to bring Snake Eater to the modern gaming era, dubbed New Style. Before starting a new game, players can choose between the New Style and Legacy, which retains the controls mapped after the original PS2 release. You can switch between styles, but be warned, this will reload the level/map and take you back to the beginning of the section. New Style is geared for people who have never played the game before, or who might prefer a more modern playstyle. The control option provides a free-moving camera that lets you view your environment in 360 degrees, making it easier to avoid getting lost or having enemies catch you unprepared. Combat and shooting feel reminiscent of Metal Gear Solid V, with a third-person over-the-shoulder camera. By default, aim assist is turned on, but can be toggled off. Even in New Style, you can still switch to a classic first-person view and still fully move around as if playing a FPS title. First-person view is especially valuable when lining up the perfect shot through a chainlink fence, which I couldn’t pull off in third-person. The biggest saving grace in the updated control scheme is the remapped directional buttons. Holding left brings up your non-combat inventory, and holding right brings up your currently equipped weapons. Up brings up the quick-change camouflage menu, while down brings up your radio —a hugely welcome shortcut. No more digging through menus to change outfits based on your environment. Snake sneaks through a range of environments in Snake Eater, each suited to different camouflage options The quick change menu conveniently shows the optimal face and body combo from your collection based on the current environmentIn one instance, I managed to seamlessly transition from a green texture to a stone grey-black getup, then to a rust-colored camouflage, all along the same crawl route. This new quality-of-life option keeps the action flowing. Another great accessibility feature is the ability to fine-tune game hints. From always-on to none at all. I had it set to show helpful hints when they were relevant, like swimming controls appearing by a body of water and hanging controls on the cliffside. This is particularly helpful in rare gameplay situations, as it kept me from panicking in high-stress situations.  What a thrill The voice cast still delivers, and The Cobra Unit is just as compelling, with big moments still having the right impact. The ladder scene took me right back to playing the original on my grandmother’s floor all those years ago.  Paradoxes, easter eggs, and all the details I’d expect are still in place. I didn’t encounter any moments that felt off or deviated too far in any way from the script. The opening theme and intro movie have been remixed, and while it will come down to personal taste, every note still hits for me.  Metal Gear Solid Delta: Snake Eater launches on August 28 for PS5, and is a day to mark on your calendar whether you’re a longtime fan or series newcomer interested in discovering the celebrated origins of the storyline.   Metal Gear Solid Delta: Snake Eater developers discuss the game in length in a new interview. #metal #gear #solid #delta #snake
    Metal Gear Solid Delta: Snake Eater hands-on report
    blog.playstation.com
    It’s been over two decades since Metal Gear Solid 3: Snake Eater was first released on PlayStation 2. The game was praised for its story, characters, and possibly one of the greatest themes in video game history. After some brumation, it sheds its skin and emerges as Metal Gear Solid Delta: Snake Eater on August 28, aiming to recapture the spirit that made the original a beloved classic. After about eight hours of playing the game on PS5 Pro, I’m thrilled to share how it captures and modernizes the original’s spirit, and then some. Play Video Delta is a true from-the-ground-up remake that is extremely faithful to the original work in most aspects of the game, but what was immediately apparent was the level of detail the updated visuals and textures add to the experience.  A new level of visual fidelity View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image This updated version of Snake Eater is a visual feast on PS5 Pro, especially in the lush details. For example, rain droplets trickle realistically down a poncho, and Snake’s camouflage and uniforms become dirty with mud or forest debris. This filth even carries over into cutscenes, adding an appreciated level of realism. The Metal Gear series showcases a range of grizzled warriors, many with scars that tell a tale. If you’re familiar with Snake Eater, you understand that scars hold a lot of importance throughout, and the devs took great care to make them stand out. One of the most notable examples is Colonel Volgin’s harrowingly scarred face. The believable tissue and its deformation when he speaks create a tragically beautiful portrait.  Speaking of portraits, a new photo mode has been added with all the latest bells and whistles. Like most Metal Gear games, Delta definitely has its fair share of silly moments, and you can capture them all. With plenty of filters and settings, create a masterpiece on the mountainside, or dress up in a crocodile head and let antics ensue. Photo Mode is the perfect way to capture all the little details hiding within. Game controls – New Style vs. Legacy A new control scheme has been introduced to bring Snake Eater to the modern gaming era, dubbed New Style. Before starting a new game, players can choose between the New Style and Legacy, which retains the controls mapped after the original PS2 release. You can switch between styles, but be warned, this will reload the level/map and take you back to the beginning of the section. New Style is geared for people who have never played the game before, or who might prefer a more modern playstyle. The control option provides a free-moving camera that lets you view your environment in 360 degrees, making it easier to avoid getting lost or having enemies catch you unprepared. Combat and shooting feel reminiscent of Metal Gear Solid V, with a third-person over-the-shoulder camera. By default, aim assist is turned on, but can be toggled off. Even in New Style, you can still switch to a classic first-person view and still fully move around as if playing a FPS title. First-person view is especially valuable when lining up the perfect shot through a chainlink fence, which I couldn’t pull off in third-person. The biggest saving grace in the updated control scheme is the remapped directional buttons. Holding left brings up your non-combat inventory, and holding right brings up your currently equipped weapons. Up brings up the quick-change camouflage menu, while down brings up your radio —a hugely welcome shortcut. No more digging through menus to change outfits based on your environment. Snake sneaks through a range of environments in Snake Eater, each suited to different camouflage options The quick change menu conveniently shows the optimal face and body combo from your collection based on the current environmentIn one instance, I managed to seamlessly transition from a green texture to a stone grey-black getup, then to a rust-colored camouflage, all along the same crawl route. This new quality-of-life option keeps the action flowing. Another great accessibility feature is the ability to fine-tune game hints. From always-on to none at all. I had it set to show helpful hints when they were relevant, like swimming controls appearing by a body of water and hanging controls on the cliffside. This is particularly helpful in rare gameplay situations, as it kept me from panicking in high-stress situations.  What a thrill The voice cast still delivers, and The Cobra Unit is just as compelling, with big moments still having the right impact. The ladder scene took me right back to playing the original on my grandmother’s floor all those years ago.  Paradoxes, easter eggs, and all the details I’d expect are still in place. I didn’t encounter any moments that felt off or deviated too far in any way from the script. The opening theme and intro movie have been remixed, and while it will come down to personal taste, every note still hits for me.  Metal Gear Solid Delta: Snake Eater launches on August 28 for PS5, and is a day to mark on your calendar whether you’re a longtime fan or series newcomer interested in discovering the celebrated origins of the storyline.   Metal Gear Solid Delta: Snake Eater developers discuss the game in length in a new interview.
    Like
    Love
    Wow
    Angry
    Sad
    387
    · 2 Commentaires ·0 Parts
  • Du Caire à l’Algérie, les premiers pas de Maria font le tour du web [VIDÉO]

    C’est un moment émouvant pour une petite fille algérienne qui a réussi à marcher pour la première fois, sans assistance, grâce à l’expertise égyptienne dansL’article Du Caire à l’Algérie, les premiers pas de Maria font le tour du webest apparu en premier sur .
    #caire #lalgérie #les #premiers #pas
    Du Caire à l’Algérie, les premiers pas de Maria font le tour du web [VIDÉO]
    C’est un moment émouvant pour une petite fille algérienne qui a réussi à marcher pour la première fois, sans assistance, grâce à l’expertise égyptienne dansL’article Du Caire à l’Algérie, les premiers pas de Maria font le tour du webest apparu en premier sur . #caire #lalgérie #les #premiers #pas
    Du Caire à l’Algérie, les premiers pas de Maria font le tour du web [VIDÉO]
    www.algerie360.com
    C’est un moment émouvant pour une petite fille algérienne qui a réussi à marcher pour la première fois, sans assistance, grâce à l’expertise égyptienne dans […] L’article Du Caire à l’Algérie, les premiers pas de Maria font le tour du web [VIDÉO] est apparu en premier sur .
    Like
    Love
    Wow
    Sad
    Angry
    395
    · 2 Commentaires ·0 Parts
Plus de résultats
ollo https://www.ollo.ws