• Hey everyone, IFA 2025 has opened its doors in Berlin, and the news is already pouring out about everything new in the tech world!

    This show is one of the biggest consumer-technology trade fairs in Europe, and announcements about new gadgets, technologies, and features just keep coming! If you want to know what's new in gadgets and innovations, this article rounds up the best announcements so far.

    Personally, every time I see something new in tech it motivates me to try new things! For example, I've heard about some smart devices that are supposed to make our lives easier and more fun, and I'd love to hear your opinions on them!

    Don't forget to check out the link for more details!

    https://www.theverge.com/news/769573/ifa-2025-smart-home-lights-power-bank-robot-vacuum-ai-headphones

    #technology #IFA2025 #gadgets #future #innovation
    www.theverge.com
    The doors to IFA 2025 in Berlin, Germany, have officially opened and new gadgets, tech, features, and upgrades continue to pour out of Europe's largest consumer tech show. There's a lot of news to stay on top of, and if you're struggling to ingest it
  • Hey everyone, have you ever imagined a day when you'd see robots interacting with each other? I've been hearing about OpenAI and their innovations for a long time, and today I want to tell you about their 2019 Symposium.

    On April 27, 2019, they held the first OpenAI Robotics Symposium, where they discussed the most important developments in robotics and how they could change our lives. I had the chance to take part in a small workshop with them, and I saw how passionately people work on amazing projects.

    Honestly, I'm optimistic about these capabilities, and I see a promising future in them. Imagine it with me: robots helping us in every area of life, from medicine to agriculture.

    And seriously, I'd love for you to think about how this technology could change the way we see the world.

    https://openai.com/index/symposium-2019

    #robots #OpenAI #technology #Innovation #Symposium
    openai.com
    We hosted the first OpenAI Robotics Symposium on April 27, 2019.
  • Hey everyone, have you seen the latest from Tesla?

    Now anyone can download the Robotaxi app on iPhone, and the great part is that the service is moving beyond the early-access user group! But it's not all easy: there's a waitlist, so you can't book a car right away.

    The app covers Austin, Texas for now, and honestly, I'm eager to know what riding in a driverless car feels like! Imagine cruising in a car that handles everything itself. Could it get hacked, or could something go wrong with the driving?

    Technology is evolving fast, and the wait should apparently be over soon. We'll see how this ride turns out!

    https://www.engadget.com/transportation/evs/anyone-can-now-download-teslas-ios-robotaxi-app-but-theres-still-a-waitlist-165523028.html?src=rss

    #Tesla #Robotaxi #technology #smart_cars #innovation
    www.engadget.com
    Tesla has made its iOS Robotaxi app available for all iPhone users to download as it expands service beyond a group of early access users for the first time. Beyond the current lack of an Android app (Tesla says one is on the way), there are a couple
  • How could technology change the way farming is done in our country?

    There's news about Orchard Robotics, a startup founded by a Thiel fellow and Cornell dropout, which has raised $22 million in funding! The idea is simple: they use truck-mounted cameras to collect and analyze data about crop health. In other words, using AI, they can know exactly how the crops are doing at any given time.

    This topic excites me a lot, because agriculture plays a big role in our economy, and modernizing it with technology could open new horizons for farmers. Innovative solutions like these can really improve productivity.

    Let's think about how we could put this new technology to use in our country and improve our farming.

    https://techcrunch.com/2025/09/03/orchard-robotics-founded-by-a-thiel-fellow-cornell-dropout-raises-22m-for-farm-vision-ai/

    #agriculture #technology #
    techcrunch.com
    The startup uses truck-mounted cameras to collect and analyze data about crop health.
  • Who can imagine a world without work? That's the idea Elon Musk raised when he proposed a "universal high income".

    The article asks: how will we earn a living in the age of artificial intelligence? We keep hearing about robots and how they're replacing people in some jobs. So, what are we going to do? Musk says the solution is a "universal income" that guarantees our financial security and lets us live without worry.

    Personally, this idea makes me think a lot. There are many sensitive issues here: how can we prepare for a technologically advanced era, and is any of us actually living through these changes and feeling them?

    Imagine with me a future with no worries about work. But at the same time, how do we keep developing ourselves and our skills?

    https://www.wsj.com/opinion/how-will-we-earn-an-income-in-the-age-of-ai-economy-jobs-robots-wage-a87c
    www.wsj.com
    Elon Musk is on the right track with suggestions of a ‘universal high income.’
  • Got some good news for you today!

    Clearwave Fiber has decided to support the Voltron Robotics team at Lansing High School, a big step toward strengthening local connections and fostering economic growth through fast internet. This contribution will help young people develop their skills and open new opportunities for them in the tech world.

    Personally, I think we need initiatives like this to raise the level of education and technology in our country. Many of my friends have taken part in similar teams, and the results were impressive!

    Let's each support local talent and encourage young people to innovate!

    https://www.globenewswire.com/news-release/2025/08/28/3141015/0/en/Clearwave-Fiber-Supports-Lansing-High-School-Voltron-Robotics-Team.html

    #innovation #Robotics #supporting_talent #technology #Lansing
    www.globenewswire.com
    LANSING, Kan., Aug. 28, 2025 (GLOBE NEWSWIRE) -- Clearwave Fiber is excited to announce a donation to the Lansing High School Voltron Robotics Team, highlighting a dedication to strengthening local connections and fostering economic growth through
  • Hey everyone, picture this: a kid named Jitender in New Delhi used to watch his parents do a grueling job, clearing sewers by hand. Today, he is one of about 200 contractors working with the government to shift from that traditional method to safe, mechanized ones. Even though the practice has been banned, the problems persist.

    The article shows us how technology can change people's lives, yet still remains out of reach in some places. Honestly, this topic makes me think about how some people live a reality very different from ours, and the challenges they face.

    If each of us tried to make a small change in our own field, the difference would be huge.

    https://www.technologyreview.com/2025/08/27/1121423/india-sewer-robots-sanitation/

    #technology #India #change #sewers #robots
    www.technologyreview.com
    When Jitender was a child in New Delhi, both his parents worked as manual scavengers—a job that involved clearing the city’s sewers of solid waste by hand. Now, he is among almost 200 contractors involved in the Delhi government’s effort to shift fro
  • From Meta AI to ChatGPT, the dangerous game of ever-deeper personalization of AI

    SOLÈNE REVENEY / « LE MONDE ». It is a paradox in which many makers of artificial intelligence (AI) find themselves trapped: why invest so much money and effort in building safer, more neutral AI if, once commercialized, these conversational bots can be so widely led astray? Take the case of OpenAI. In April, the maker of ChatGPT said it was publicly considering making its chatbot less obsequious, because it risks drawing vulnerable people into an unhealthy dependence, according to findings published by its own researchers a month earlier. Several cases reported in the media, notably by the New York Times, Ars Technica, Futurism, and Reuters, have illustrated the dangerous spiral some users can fall into, sometimes ending in a fatal break with reality. Then, on August 7, OpenAI took the plunge, announcing a new version of its bot, GPT-5. Its tone is markedly colder, and it now monitors the length of conversations to suggest breaks when it deems them necessary. The company is thereby applying two key recommendations from some of the most critical AI researchers, including experts at Google, who in 2024 authored a long survey of the dangers of AI. In that document, they anticipate a world where robot flattery reduces "the opportunities humans have to grow and develop [to] become better versions of themselves", or even "a world where users abandon the complicated, imperfect interactions (…) with humans in favor of the frictionless exchanges provided by AI".
    #meta #chatgpt #dangerous #game
    www.lemonde.fr
  • NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI

    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry.
    Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device.
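As a back-of-envelope check, the quoted multipliers can be turned into rough absolute figures. The AGX Orin baseline numbers below (about 275 sparse INT8 TOPS of AI compute and 64 GB of memory) are assumptions drawn from NVIDIA's published Orin specs, not from this article; the sketch simply scales them by the article's multipliers.

```python
# Back-of-envelope estimate of Jetson Thor specs from the
# multipliers quoted above (7.5x AI compute, 2x memory vs. AGX Orin).
# The Orin baseline values are assumed, not stated in this article.

ORIN_BASELINE = {
    "ai_compute_tops": 275.0,  # assumed AGX Orin peak (sparse INT8)
    "memory_gb": 64.0,         # assumed AGX Orin memory
}

THOR_MULTIPLIERS = {
    "ai_compute_tops": 7.5,  # "7.5x more AI compute"
    "memory_gb": 2.0,        # "2x more memory"
}

def estimate_thor(baseline: dict, multipliers: dict) -> dict:
    """Scale each baseline spec by the article's quoted multiplier."""
    return {key: baseline[key] * multipliers[key] for key in baseline}

if __name__ == "__main__":
    for spec, value in estimate_thor(ORIN_BASELINE, THOR_MULTIPLIERS).items():
        print(f"{spec}: ~{value:g}")
```

Under these assumed baselines the estimate lands around 2,000 TOPS and 128 GB, which is only an illustration of the relative uplift the article describes, not an official spec.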
    This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics.

    Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments.
    “The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills,” said Peggy Johnson, CEO of Agility Robotics. “With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers’ warehouses and factories.”
    Boston Dynamics — which has been building some of the industry’s most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device.
    Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models.
    A Giant Leap for Real-Time Robot Reasoning
    Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency.
    Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.
    NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization.
    With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.
    Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing.
    With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.
    Jetson Thor Set to Advance Research Innovation 
    Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications.
    At Carnegie Mellon’s Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue.
    “We can only do as much as the compute available allows,” said Sebastian Scherer, an associate research professor at the university and head of the AirLab. “Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks.”
    Scherer anticipates that by upgrading from his team’s existing NVIDIA Jetson AGX Orin systems to the Jetson AGX Thor developer kit, they’ll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets.
    Wield the Strength of Jetson Thor
    The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply.
    NVIDIA Jetson AGX Thor Developer Kit
    The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors.
    Sensor and actuator companies including Analog Devices, Inc., e-con Systems, Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency.
    Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio.
    More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough.

    To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face.
    The NVIDIA Jetson AGX Thor developer kit is available now starting at NVIDIA Jetson T5000 modules are available starting at for 1,000 units. Buy now from authorized NVIDIA partners.
    NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
    #nvidia #jetson #thor #unlocks #realtime
    blogs.nvidia.com
    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry.

    Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device. This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics.

    Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit's real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments.

    "The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills," said Peggy Johnson, CEO of Agility Robotics. "With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers' warehouses and factories."

    Boston Dynamics — which has been building some of the industry's most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device. Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models.

    A Giant Leap for Real-Time Robot Reasoning

    Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency. Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.

    NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization. With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.

    Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow, with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing. With these software tools, developers can easily build and deploy applications such as visual AI agents that analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.

    Jetson Thor Set to Advance Research Innovation

    Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications. At Carnegie Mellon's Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue.

    "We can only do as much as the compute available allows," said Sebastian Scherer, an associate research professor at the university and head of the AirLab. "Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks."

    Scherer anticipates that by upgrading from his team's existing NVIDIA Jetson AGX Orin systems to the Jetson AGX Thor developer kit, they'll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets.

    Wield the Strength of Jetson Thor

    The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply.

    The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors. Sensor and actuator companies including Analog Devices, Inc. (ADI), e-con Systems, Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency.

    Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio. More than 2 million developers use NVIDIA technologies to accelerate robotics workflows.

    Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough. To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face. The NVIDIA Jetson AGX Thor developer kit is available now starting at $3,499. NVIDIA Jetson T5000 modules are available starting at $2,999 for 1,000 units. Buy now from authorized NVIDIA partners.

    NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
  • How's everyone doing? I've got some news to brighten your day!

    Robomart has unveiled a new delivery robot, the RM5, which can carry up to 50 pounds and bring your orders right to your door, like shopping at your neighbor's place! And the surprise? A flat $3 fee. No steep prices, no complications.

    Look, I always say technology should be at our service, but imagine this robot delivering your orders while you're still in your pajamas! Some days I wish I had one at home just to tell it: "Fetch me some drinks, and pass the cake!"

    How do you see this new trend in the delivery world?

    https://techcrunch.com/2025/08/25/robomart-unveils-new-delivery-robot-with-3-flat-fee-to-challenge-doordash-uber-eats/

    #Robot #Innovations #TechAlgeria #Delivery
    techcrunch.com
    Robomart's RM5 autonomous delivery robot can carry up to 50 pounds and deliver multiple customer orders at once.
  • Creating a Detailed Helmet Inspired by Fallout Using Substance 3D

    Introduction

    Hi! My name is Pavel Vorobyev, and I'm a 19-year-old 3D Artist specializing in texturing and weapon creation for video games. I've been working in the industry for about 3 years now. During this time, I've had the opportunity to contribute to several exciting projects, including Arma, DayZ, Ratten Reich, and a NEXT-GEN sci-fi shooter (currently under NDA). Here's my ArtStation portfolio.

    My journey into 3D art began in my early teens, around the age of 13 or 14. At some point, I got tired of just playing games and started wondering: "How are they actually made?" That question led me to explore game development. I tried everything – level design, programming, game design – but it was 3D art that truly captured me.

    I'm entirely self-taught. I learned everything from YouTube, tutorials, articles, and official documentation, gathering knowledge piece by piece. Breaking into the commercial side of the industry wasn't easy: there were a lot of failures, no opportunities, and no support. At one point, I even took a job at a metallurgical plant. But I kept pushing forward, kept learning and improving my skills in 3D. Eventually, I got my first industry offer – and that's when my real path began.

    Today, I continue to grow, constantly experimenting with new styles, tools, and techniques. For me, 3D isn't just a profession – it's a form of self-expression and a path toward my dream. My goal is to build a strong career in the game industry and eventually move into cinematic storytelling in the spirit of Love, Death & Robots (Astartes YouTube channel).

    I also want to inspire younger artists and show how powerful texturing can be as a creative tool. To demonstrate that, I'd love to share my personal project PU – Part 1, which reflects my passion and approach to texture art. In this article, I'll be sharing my latest personal project – a semi-realistic sci-fi helmet that I created from scratch, experimenting with both form and style. It's a personal exploration where I aimed to step away from traditional hyperrealism and bring in a touch of artistic expression.

    Concept & Project Idea

    The idea behind this helmet project came from a very specific goal – to design a visually appealing asset with rich texture variation and achieve a balance between stylization and realism. I wanted to create something that looked believable, yet had an artistic flair. Since I couldn't find any fitting concepts online, I started building the design from scratch in my head. I eventually settled on creating a helmet as the main focus of the project. For visual direction, I drew inspiration from post-apocalyptic themes and the gritty aesthetics of Fallout and Warhammer 40,000.

    Software & Tools Used

    For this project, I used Blender, ZBrush, Substance 3D Painter, Marmoset Toolbag 5, Photoshop, and RizomUV. I created the low-poly mesh in Blender and developed the concept and high-poly sculpt in ZBrush. In Substance 3D Painter, I worked on the texture concept and final texturing. Baking and rendering were done in Marmoset Toolbag, and I used Photoshop for some adjustments to the bake. UV unwrapping was handled in RizomUV.

    Modeling & Retopology

    I began the development process by designing the concept based on my earlier references – Fallout and Warhammer 40,000. The initial blockout was done in ZBrush, and from there, I started refining the shapes and details to create something visually engaging and stylistically bold.

    After completing the high-poly model, I moved on to the long and challenging process of retopology. Since I originally came from a weapons-focused background, I applied the knowledge I gained from modeling firearms. I slightly increased the polycount to achieve a cleaner and more appealing look in the final render – reducing visible faceting. My goal was to strike a balance between visual quality and a game-ready asset.

    UV Mapping & Baking

    Next, I moved on to UV mapping. There's nothing too complex about this stage, but since my goal was to create a game-ready asset, I made extensive use of overlaps. I did the UVs in RizomUV. The most important part is to align the UV shells into clean strips and unwrap cylinders properly into straight lines. Once the UVs were done, I proceeded to bake the normal and ambient occlusion maps. At this stage, the key is having clean UVs and solid retopology – if those are in place, the bake goes smoothly.

    Texturing: Concept & Workflow

    Now we move on to the most challenging stage – texturing. I aimed to present the project in a hyperrealistic style with a touch of stylization. This turned out to be quite difficult, and I went through many iterations. The most important part of this phase was developing a solid texture concept: rough decals, color combinations, and overall material direction. Without that foundation, it makes no sense to move forward with the texturing. After a long process of trial and error, I finally arrived at results I was satisfied with. Then I followed my pipeline:

    1. Working on the base materials
    2. Storytelling and damage
    3. Decals
    4. Spraying, dust, and dirt

    Working on the Base Materials

    When working on the base materials, the main thing is to work with the physical properties and texture. You need to extract the maximum quality from the generators before manual processing. The idea was to create the feeling of an old, heavy helmet that had lived its life and had previously been painted a different color – to make it battered and, in a sense, rotten. It is important to pay attention to noise maps – Dirt 3, Dirt 6, White Noise, Flakes – and add the feel of old metal with custom Normal Maps. I also mixed in photo textures for a special charm.

    Storytelling & Damage

    Gradients play an important role in the storytelling stage. They make the object artistically dynamic and beautiful, adding individual shades that bring the helmet to life. Everything else is done manually. I found a bunch of old helmets from World War II and made damage alphas from them using Photoshop. I drew the damage with alphas, trying to clearly separate the material into old paint, new paint, rust, and bare metal. I did the rust using MatFX Rust from the standard Substance 3D Painter library, drawing patterns with paint in multiply mode – this quickly recreated the rust effect. Metal damage and old paint were more difficult: due to the large number of overlaps in the helmet, I had to carefully draw patterns, minimizing the visibility of overlaps.

    Decals

    I drew the decals carefully, sticking to the concept, which added richness to the texture.

    Spray Paint & Dirt

    For spray paint and dirt, I used a long-established weapon template consisting of dust particles, sand particles, and spray paint. I analyzed references and applied them to crevices and logical places where dirt could accumulate.

    Rendering & Post-Processing

    I rendered in Marmoset Toolbag 5 using a new rendering format that I developed together with the team. The essence of the method is to simulate "RAW frames." Since Marmoset does not have such functions, I worked with the EXR 32-bit format, which significantly improves the quality of the render: the shadows are smooth, without artifacts and broken gradients. I assembled the scene using Quixel Megascans. After rendering, I did post-processing in Photoshop using the Camera Raw filter.

    Conclusion & Advice for Beginners

    That's all. For beginners or those who have been unsuccessful in the industry for a long time, I advise you to follow your dream and not listen to anyone else. Success is a matter of time and skill! Talent is not something you are born with; it is something you develop. Work on yourself and your work, put your heart into it, and you will succeed!

    Pavel Vorobyev, Texture Artist
    Interview conducted by Gloria Levine
    Creating a Detailed Helmet Inspired by Fallout Using Substance 3D
    80.lv
    IntroductionHi! My name is Pavel Vorobyev, and I'm a 19-year-old 3D Artist specializing in texturing and weapon creation for video games. I've been working in the industry for about 3 years now. During this time, I've had the opportunity to contribute to several exciting projects, including Arma, DayZ, Ratten Reich, and a NEXT-GEN sci-fi shooter (currently under NDA). Here's my ArtStation portfolio.My journey into 3D art began in my early teens, around the age of 13 or 14. At some point, I got tired of just playing games and started wondering: "How are they actually made?" That question led me to explore game development. I tried everything – level design, programming, game design – but it was 3D art that truly captured me.I'm entirely self-taught. I learned everything from YouTube, tutorials, articles, and official documentation, gathering knowledge piece by piece. Breaking into the commercial side of the industry wasn't easy: there were a lot of failures, no opportunities, and no support. At one point, I even took a job at a metallurgical plant. But I kept pushing forward, kept learning and improving my skills in 3D. Eventually, I got my first industry offer – and that's when my real path began.Today, I continue to grow, constantly experimenting with new styles, tools, and techniques. For me, 3D isn't just a profession – it's a form of self-expression and a path toward my dream. My goal is to build a strong career in the game industry and eventually move into cinematic storytelling in the spirit of Love, Death & Robots.Astartes YouTube channelI also want to inspire younger artists and show how powerful texturing can be as a creative tool. To demonstrate that, I'd love to share my personal project PU – Part 1, which reflects my passion and approach to texture art.In this article, I'll be sharing my latest personal project – a semi-realistic sci-fi helmet that I created from scratch, experimenting with both form and style. 
It's a personal exploration where I aimed to step away from traditional hyperrealism and bring in a touch of artistic expression.Concept & Project IdeaThe idea behind this helmet project came from a very specific goal – to design a visually appealing asset with rich texture variation and achieve a balance between stylization and realism. I wanted to create something that looked believable, yet had an artistic flair. Since I couldn't find any fitting concepts online, I started building the design from scratch in my head. I eventually settled on creating a helmet as the main focus of the project. For visual direction, I drew inspiration from post-apocalyptic themes and the gritty aesthetics of Fallout and Warhammer 40,000.Software & Tools UsedFor this project, I used Blender, ZBrush, Substance 3D Painter, Marmoset Toolbag 5, Photoshop, and RizomUV. I created the low-poly mesh in Blender and developed the concept and high-poly sculpt in ZBrush. In Substance 3D Painter, I worked on the texture concept and final texturing. Baking and rendering were done in Marmoset Toolbag, and I used Photoshop for some adjustments to the bake. UV unwrapping was handled in RizomUV.Modeling & RetopologyI began the development process by designing the concept based on my earlier references – Fallout and Warhammer 40,000. The initial blockout was done in ZBrush, and from there, I started refining the shapes and details to create something visually engaging and stylistically bold.After completing the high-poly model, I moved on to the long and challenging process of retopology. Since I originally came from a weapons-focused background, I applied the knowledge I gained from modeling firearms. I slightly increased the polycount to achieve a cleaner and more appealing look in the final render – reducing visible faceting. My goal was to strike a balance between visual quality and a game-ready asset.UV Mapping & BakingNext, I moved on to UV mapping. 
There's nothing too complex about this stage, but since my goal was to create a game-ready asset, I made extensive use of overlaps. I did the UVs in Rizom UV. The most important part is to align the UV shells into clean strips and unwrap cylinders properly into straight lines.Once the UVs were done, I proceeded to bake the normal and ambient occlusion maps. At this stage, the key is having clean UVs and solid retopology – if those are in place, the bake goes smoothly. Texturing: Concept & WorkflowNow we move on to the most challenging stage – texturing. I aimed to present the project in a hyperrealistic style with a touch of stylization. This turned out to be quite difficult, and I went through many iterations. The most important part of this phase was developing a solid texture concept: rough decals, color combinations, and overall material direction. Without that foundation, it makes no sense to move forward with the texturing. After a long process of trial and error, I finally arrived at results I was satisfied with.Then I followed my pipeline:1. Working on the base materials2. Storytelling and damage3. Decals4. Spraying, dust, and dirtWorking on the Base MaterialsWhen working on the base materials, the main thing is to work with the physical properties and texture. You need to extract the maximum quality from the generators before manual processing. The idea was to create the feeling of an old, heavy helmet that had lived its life and had previously been painted a different color. To make it battered and, in a sense, rotten.It is important to pay attention to noise maps – Dirt 3, Dirt 6, White Noise, Flakes – and add the feel of old metal with custom Normal Maps. I also mixed in photo textures for a special charm. PhototextureCustom Normal Map TextureStorytelling & DamageGradients play an important role in the storytelling stage. 
They make the object artistically dynamic and beautiful, adding individual shades that bring the helmet to life.Everything else is done manually. I found a bunch of old helmets from World War II and took alpha damage shots of them using Photoshop. I drew the damage with alphas, trying to clearly separate the material into old paint, new paint, rust, and bare metal.I did the rust using MatFX Rust from the standard Substance 3D Painter library. I drew beautiful patterns using paint in multiply mode – this quickly helped to recreate the rust effect. Metal damage and old paint were more difficult: due to the large number of overlaps in the helmet, I had to carefully draw patterns, minimizing the visibility of overlaps.DecalsI drew the decals carefully, sticking to the concept, which added richness to the texture.Spray Paint & DirtFor spray paint and dirt, I used a long-established weapon template consisting of dust particles, sand particles, and spray paint. I analyzed references and applied them to crevices and logical places where dirt could accumulate.Rendering & Post-ProcessingI rendered in Marmoset Toolbag 5 using a new rendering format that I developed together with the team. The essence of the method is to simulate "RAW frames." Since Marmoset does not have such functions, I worked with the EXR 32-BIT format, which significantly improves the quality of the render: the shadows are smooth, without artifacts and broken gradients. I assembled the scene using Quixel Megascans. After rendering, I did post-processing in Photoshop utilizing Filter Camera Raw. Conclusion & Advice for BeginnersThat's all. For beginners or those who have been unsuccessful in the industry for a long time, I advise you to follow your dream and not listen to anyone else. Success is a matter of time and skill! Talent is not something you are born with; it is something you develop. 
Work on yourself and your work, put your heart into it, and you will succeed!

Pavel Vorobiev, Texture Artist

Interview conducted by Gloria Levine
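The benefit of the 32-bit EXR "RAW frame" approach described in the rendering section can be shown with a small NumPy sketch (an illustration only, not the artist's actual tooling): grading a float image preserves smooth gradients, while the same grade applied to an 8-bit export produces the banding and "broken gradients" the interview mentions.

```python
import numpy as np

# Illustrative sketch (not the artist's pipeline): why post-processing a
# 32-bit float ("RAW-like") render keeps gradients smooth, while an 8-bit
# export of the same image bands. We build a smooth HDR ramp, quantize one
# copy to 8-bit, then apply the same exposure pull + Reinhard tone map.

hdr = np.linspace(0.0, 4.0, 1024, dtype=np.float32)     # smooth HDR gradient
ldr = np.round(np.clip(hdr, 0.0, 1.0) * 255.0) / 255.0  # clipped, 8-bit steps

def grade(x, exposure=0.25):
    """Exposure adjustment followed by Reinhard tone mapping."""
    x = x * exposure
    return x / (1.0 + x)

graded_hdr = grade(hdr)
graded_ldr = grade(ldr)

# The float image keeps far more distinct tonal steps after grading,
# which is what keeps shadows and gradients free of visible banding.
print(len(np.unique(graded_hdr)), len(np.unique(graded_ldr)))
```

The 8-bit copy can never have more than 256 distinct levels, so any exposure push in post just spreads those few levels apart; the float copy retains the full tonal resolution of the render.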
• Hey everyone, I remember one day when I was playing a farming sim – all calm and peaceful – and suddenly a flying robot mech dropped right in on me! That's where the idea for this new post came from.

    Today I want to tell you about "Bounty Star," which launches on October 23, 2025 for PS5. The game combines action with farming mechanics – you work the land and fight at the same time. Publisher Annapurna Interactive and developer Dinogod announced the release date after a long wait, so it's clearly been highly anticipated!

    As for me, I think this concept appeals to the imagination of any gamer, especially those who like to get creative and become farmers – just in an unconventional way.

    Imagine how much fun it will be to combine growing crops with mech action!

    https://www.pushsquare.com/news/2025/08/mech-action-meets-farming-sim-in-bounty-star-finally-coming-to-ps5-in-o
    www.pushsquare.com
    Mech-ing a living. Publisher Annapurna Interactive and developer Dinogod have finally announced a release date for Bounty Star, a game announced back in 2022. Combining mech-based action with serene farming and base building, the game launches 23rd Oct