• On every holiday and special occasion, food becomes part of our memories. This time we want to tell you about Laghouat and the "الدزار" sweets that light up the nights of celebrating the Mawlid, the birthday of the Prophet.

    The video shows us how this tradition has been handed down from generation to generation, and how the traditional sweets that warm our hearts and gather the family around the table are prepared. The scent, the colors and the taste: everything speaks of our history and our culture.

    For me, every time we get together with the family and share these sweets, a feeling of heritage and happiness settles in. And with every bite, we remember childhood nights and the laughter of friends.

    It is a moment to reflect on the value of customs and traditions in our lives. They truly create strong bonds between generations and let us live in the moment.

    https://www.youtube.com/watch?v=ZuEAgcEPqaI
    #الأغواط #حلويات_الدزار #Traditions #CulinaryHeritage #E
  • How is everyone doing? Today I wanted to share with you an amazing story about a small town called Hunacti in Yucatan.

    A team of researchers has uncovered the history of this town, founded in the 16th century, with its stone streets and Spanish-style church. But the story does not stop there! The people of Hunacti were firmly determined to preserve their Maya religious traditions, despite all the pressure they endured during the colonial period.

    This reminds us how important it is to hold on to our culture and heritage, whatever the challenges. Each of us has a story to tell, and a role to play in preserving our identity.

    If you want to learn more, check out the link here:
    https://www.ancientpages.com/2025/09/02/maya-town-hunacti-in-northern-yucatan-stood-defiantly-in-early-colonial-era/

    #ثقافة #Maya #Heritage #تاريخ #Traditions
  • In a time full of distractions, we sometimes need to go back to our roots and remember the simplicity of rural life. In a new video, we discover the atmosphere of the Mawlid in the village and wander through the beautiful countryside of Mila and Constantine.

    We will share warm moments with fresh produce from the farm, and taste traditional dishes that show our love for our country, like eggplant maqlouba and kesra baked over an open fire.

    "The simple life is the secret of true happiness." Personally, every time I go to the countryside, I come back full of energy and love for life.

    Open your eyes and discover the beauty of the Algerian countryside, and try preparing these delicious dishes with us.

    https://www.youtube.com/watch?v=NG5vANKdZgE
    #الريف_الجزائري #الحياة_البسيطة #AlgerianTraditions #SimpleLife #AuthenticAlger
  • Have you ever heard of "سلوم البدو", the Bedouins' unwritten customary codes? The new video from the "ذا قال" podcast will take you on a journey through norms that were never written down, yet carry great weight in Bedouin life!

    In this episode, we discover how Bedouin culture revolves around generosity and courage, and what "حرمة بيت الشَعر" (the sanctity of the goat-hair tent) and "الدخالة في الوجه" (seeking protection under someone's honor) mean. The poet Reda Al-Ruwaili stresses the importance of preserving our dialect and customs. As he put it: "No cloud can hide the sun."

    For me, these customs make me think about our original identity and our history, and about how we must preserve them. Each of us has a role in safeguarding our heritage.

    Don't miss the video; you will learn a lot!
    https://www.youtube.com/watch?v=Jna1lXbPdY0

    #سلوم_البدو #الهوية_العربية #CultureMatters #Tradition #BedouinWisdom
  • Everyone, today I wanted to share a new video that lets us experience the atmosphere of Laghouat and the "الميلودية" (Meloudia), a local tradition celebrating the Mawlid, the birthday of the Prophet.

    In this video, we discover together how the people of Laghouat celebrate and create a special atmosphere, with movements, colors and traditions that express pride and identity.

    Personally, whenever I see these celebrations, I remember my childhood with family and friends, how we used to gather, sing and dance together. Those were unforgettable moments!

    In moments like these, we can reflect on the value of traditions and on how they help unite communities and strengthen the bonds between them.

    https://www.youtube.com/watch?v=TpbwuKEIYmw
    #الأغواط #الميلودية #Traditions #EchoroukNews #Célébration
  • YouTube accused of applying AI filters to users' content

    As early as June, the first worried warnings began to appear on the Reddit forum. A few creators were alarmed by the look of videos recently published on YouTube: some detected an "oil painting" effect, others a "plastic" appearance that made them think of the telltale mark of artificial intelligence processing. But it was not until Wednesday, August 20, this time on the social network X, that the suspicion was confirmed. As an experiment, the company is indeed using a "traditional machine-learning process to unblur, denoise and improve the clarity" of some of YouTube's Shorts, the platform's short vertical videos, revealed Rene Ritchie, who handles creator relations for YouTube. By using the term "machine learning", the platform avoids the words "artificial intelligence", which probably evoke generative AI too strongly: tools that appeared in the 2020s and that can create videos from scratch from a simple text prompt. By contrast, "machine learning" can pass for a more traditional technology […]. This type of tool is nonetheless an artificial intelligence. It even shares common technical foundations with generative AI, so it would not be surprising if their respective visual signatures could be confused.

    Loss of trust

    This is why YouTube's new filters make creators fear negative consequences for their communities. "I have a huge problem with this," reacted Rhett Shull, a YouTuber with nearly 750,000 subscribers. "Right now, AI is at the heart of a heated debate. Some people are fiercely hostile to this technology, because of its environmental impact" and the intellectual-property questions it raises, he explained in a video published on Friday, August 15.
    #youtube #accusé #dappliquer #des #filtres
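    To make the filtering described above more concrete, here is a minimal sketch of the kind of conventional, non-generative enhancement the article mentions (denoising plus a clarity boost), assuming OpenCV is installed. It is purely illustrative, not YouTube's actual pipeline, and the file names are placeholders.

```python
# Illustrative only: a conventional denoise-and-sharpen pass of the kind described
# in the article. This is NOT YouTube's pipeline; it assumes OpenCV and a placeholder file.
import cv2

frame = cv2.imread("frame.png")  # hypothetical input frame

# Non-local-means denoising removes sensor noise and compression grain.
denoised = cv2.fastNlMeansDenoisingColored(frame, None, 5, 5, 7, 21)

# Unsharp masking: subtract a blurred copy to boost local contrast ("clarity").
blurred = cv2.GaussianBlur(denoised, (0, 0), 2.0)
sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)

cv2.imwrite("frame_enhanced.png", sharpened)
```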
  • Romeo is a Dead Man: A sneak peek of what to expect

    What’s up, everyone? I’m gonna assume you’ve already seen the announcement trailer for Grasshopper Manufacture’s all-new title, Romeo Is A Dead Man. If not, then do yourself a favor and go watch it now. It’s cool – I’ll wait two and a half minutes.


    OK, so you get that there’s gonna be a whole lot of extremely bloody battle action and exploring some weird places, but I think a lot of people may be confused by the sheer amount of information packed into two and a half minutes… Today, we’ll give you a teensy little glimpse of how Romeo Stargazer – aka “DeadMan”, a special agent in the FBI division known as the Space-Time Police – goes about his “investigations”.

    Romeo Is A Dead Man, abbreviated as… I don’t know, RiaDM? or maybe RoDeMa, if you’re nasty? Anyway, one of the most notable features of the game is the rich variety of graphic styles used to depict the game world. Seriously, it’s all over the place – but like, in a good way. The meticulously-tweaked action parts are done in stunning, almost photorealistic 3D, and we’ve thrown everything but the kitchen sink into the more story-based parts.

    And don’t worry, GhM fans – we promise: for as much work as we’ve put into making the game look cool and unique, the story itself is also ridiculously bonkers, as is tradition here at Grasshopper Manufacture. We think longtime fans will enjoy it, and newcomers will have their heads exploding. Either way, you’re guaranteed to see some stuff you’ve never seen before.

    As for the actual battles, our hero Romeo is heavily armed with both katana-style melee weapons and gun-style ranged weapons alike, which the player can switch between while dispersing beatdowns. However even the weaker, goombah-type enemies are pretty hardcore. You’re gonna have to think up combinations of melee, ranged, heavy, and light attacks to get by. But the stupidly gratuitous amount of blood splatter and catharsis you’re rewarded with when landing a real nuclear power move of a combo is awe-inspiring, if that’s your thing. On top of the kinda-humanoid creatures you’ve already seen, known as “Rotters”, we’ve got all kinds of other ultra-creepy, unique enemies waiting to bite your face off!

    Now, let’s look at one of the main centerpieces of any GhM game: the boss battles. This particular boss is, well, hella big. His name is “Everyday Is Like Monday”, because of course it is. It’s on you to make sure Romeo can dodge the mess of attacks launched by this big-ass tyrant and take him down to Chinatown. It’s one of the most feelgood beatdowns of the year!

    Also, being a member of something called the “Space-Time Police” means that obviously Romeo is gonna be visiting all sorts of weird, “…what?”-type places. And awaiting him at these weird, “…what?”-type places are a range of weird, “…what?”-type puzzles that only the highest double-digit IQ players will be able to solve! This thing looks like a simple sphere that someone just kinda dropped and busted, but once you really wrap your dome around it and get it solved, damn it feels good. There are a slew of other puzzles and gimmicks strategically or possibly just randomly strewn throughout the game, so keep your eyeballs peeled for them and try not to break any controllers as you encounter them along your mission.

    That’s all for now, but obviously there are still a whole bunch of important game elements we have yet to discuss, so stay tuned for next time!
    #romeo #dead #man #sneak #peak
  • NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI

    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules — new robotics computers that can serve as the brains for robotic systems across research and industry.
    Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device.
    This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge — workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics.

    Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit — and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments.
    “The powerful edge processing offered by Jetson Thor will take Digit to the next level — enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills,” said Peggy Johnson, CEO of Agility Robotics. “With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers’ warehouses and factories.”
    Boston Dynamics — which has been building some of the industry’s most advanced robots for over 30 years — is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device.
    Beyond humanoids, Jetson Thor will accelerate various robotic applications — such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents — with real-time inference on device for larger, more complex AI models.
    A Giant Leap for Real-Time Robot Reasoning
    Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents — powered by large transformer models, vision language models and vision language action models — to run in real time at the edge while minimizing cloud dependency.
    Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.
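    As a rough illustration of what "running inference locally" looks like in practice, here is a minimal sketch using the Hugging Face transformers library and PyTorch. The model name is a placeholder small model chosen for illustration; this is not an NVIDIA sample and uses no Jetson-specific APIs.

```python
# Minimal local-inference sketch (illustrative; not an NVIDIA or Jetson-specific sample).
# Assumes the `transformers` and `torch` packages and a CUDA-capable device.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-0.5B-Instruct"  # placeholder small model for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

prompt = "List three checks a warehouse robot should run before lifting a pallet."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```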
    NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization.
    With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.
    Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing.
    With these software tools, developers can easily build and deploy applications, such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.
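    For a sense of scale, the sketch below shows the smallest possible version of such a "worker safety" camera monitor, assuming OpenCV and its classic HOG pedestrian detector. It is a hypothetical illustration, not NVIDIA Metropolis or Isaac code, and the keep-out zone coordinates are invented.

```python
# Hypothetical worker-safety monitor sketch (not based on NVIDIA Metropolis/Isaac APIs).
# Assumes OpenCV (`opencv-python`) and a camera at index 0; flags frames in which the
# classic HOG pedestrian detector finds a person inside a restricted "keep-out" zone.
import cv2

KEEP_OUT = (100, 100, 400, 400)  # placeholder x1, y1, x2, y2 of the restricted area

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        # Rough rectangle-overlap test between the detection and the keep-out zone.
        if x < KEEP_OUT[2] and x + w > KEEP_OUT[0] and y < KEEP_OUT[3] and y + h > KEEP_OUT[1]:
            print("Alert: person detected inside the restricted zone")
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```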
    Jetson Thor Set to Advance Research Innovation 
    Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications.
    At Carnegie Mellon’s Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue.
    “We can only do as much as the compute available allows,” said Sebastian Scherer, an associate research professor at the university and head of the AirLab. “Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making — but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks.”
    Scherer anticipates that by upgrading from his team’s existing NVIDIA Jetson AGX Orin systems to the Jetson AGX Thor developer kit, they’ll improve the performance of AI models including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets.
    Wield the Strength of Jetson Thor
    The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply.
    NVIDIA Jetson AGX Thor Developer Kit
    The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors.
    Sensor and actuator companies including Analog Devices, Inc. (ADI), e-con Systems, Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge — a platform that simplifies sensor fusion and data streaming — to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency.
    Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio.
    More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough.

    To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face.
    The NVIDIA Jetson AGX Thor developer kit is available now starting at $3,499. NVIDIA Jetson T5000 modules are available starting at $2,999 for 1,000 units. Buy now from authorized NVIDIA partners.
    NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
    #nvidia #jetson #thor #unlocks #realtime
  • Gaming Meets Streaming: Inside the Shift

    After a long, busy day, you boot up your gaming device but don’t quite feel like diving into an intense session. Instead, you open a broadcast of one of your favorite streamers and spend the evening laughing at commentary, reacting to unexpected moments, and just enjoying your time with fellow gamers. Sounds familiar?

    This everyday scenario perfectly captures the way live streaming platforms like Twitch, YouTube Gaming, or Kick have transformed the gaming experience — turning gameplay into shared moments where gamers broadcast in real-time while viewers watch, chat, learn, and discover new titles.

    What started as friends sharing gameplay clips has exploded into a multi-billion-dollar ecosystem where streamers are popular creators, viewers build communities around shared experiences, and watching games has become as popular as playing them. But how did streaming become such a powerful force in gaming – and what does it mean for players, creators, and the industry alike? Let’s find out!

    Why Do Gamers Love Streaming?

    So why are millions of gamers spending hours every week watching others play instead of jumping into a game themselves? The answer isn’t just one thing – it’s a mix of entertainment, learning, connection, and discovery that makes live streaming uniquely compelling. Let’s break it down.

    Entertainment at Your Own Pace

    Sometimes, you just want to relax. Maybe you’re too mentally drained to queue up for ranked matches or start that complex RPG quest. Streaming offers the perfect low-effort alternative – the fun of gaming without needing to press a single button. Whether it's high-stakes gameplay, hilarious commentary, or unpredictable in-game chaos, streams let you enjoy all the excitement while kicking back on the couch, grabbing a snack, or chatting in the background.

    Learning and Skill Development

    Streaming isn’t just for laughs – it’s also one of the best ways to level up your own gameplay. Watching a skilled streamer handle a tricky boss fight, execute high-level strategies, or master a game’s mechanics can teach you far more than a dry tutorial ever could. Many gamers tune in specifically to study routes, tactics, builds, or even to understand if a game suits their playstyle before buying it. Think of it as education, but way more fun.

    Social Connection and Community

    One of the most powerful draws of live streaming is the sense of community. Jumping into a stream isn’t like watching TV – it’s like entering a room full of people who love the same games you do. Chatting with fellow viewers, sharing reactions in real-time, tossing emotes into the chaos, and getting shoutouts from the streamer – it all creates a sense of belonging. For many, it’s a go-to social space where friendships, inside jokes, and even fandoms grow.

    Discovery of New Games and Trends

    Ever found a game you now love just because you saw a streamer play it? You’re not alone. Streaming has become a major discovery engine in gaming. Watching creators try new releases, revisit cult classics, or spotlight lesser-known indies helps players find titles they might never encounter on their own. Sometimes, entire genres or games blow up because of a few well-timed streams.

    Together, these draws have sparked a whole new kind of culture – gaming communities with their own languages, celebrities, and shared rituals.

    Inside Streaming Culture

    Streaming has created something unique in gaming: genuine relationships between creators and audiences who've never met. When Asmongold reacts to the latest releases or penguinz0 delivers his signature deadpan commentary, millions of viewers don't just watch – they feel like they're hanging out with a friend. These streamers have become trusted voices whose opinions carry real weight, making gaming fame more accessible than ever. Anyone with personality and dedication can build a loyal following and become a cultural influencer.

    If you've ever watched a Twitch stream, you've witnessed chat culture in action – a chaotic river of emotes, inside jokes, and reactions that somehow make perfect sense to regulars. "KEKW" expresses laughter, "Poggers" shows excitement, and memes spread like wildfire across communities. The chat itself becomes entertainment, with viewers competing to land the perfect reaction at just the right moment. These expressions often escape their stream origins, becoming part of the broader gaming vocabulary.

    For many viewers, streams have become part of their daily routine – tuning in at the same time, celebrating milestones, or witnessing historic gaming moments together. When a streamer finally beats that impossible boss, the entire community shares in the victory. These aren't just individual entertainment experiences — they're collective memories where thousands can say "I was there when it happened," creating communities that extend far beyond gaming itself.

    How Streamers Are Reshaping the Gaming Industry

    While players tune in for fun and connection, behind the scenes, streaming is quietly reshaping how the gaming industry approaches everything from marketing to game design. What started as casual gameplay broadcasts is now influencing major decisions across studios and publishers.

    The New Marketing Powerhouse. Traditional game reviews and advertising have taken a backseat to streamer influence. A single popular creator playing your game can generate millions of views and drive massive sales overnight – just look at how Among Us exploded after a few key streamers discovered it, or how Fall Guys became a phenomenon through streaming momentum. Publishers now prioritize getting their games into the hands of influential streamers on launch day, knowing that authentic gameplay footage and reactions carry more weight than any trailer or review. Day-one streaming success has become make-or-break for many titles.

    Designing for the Stream. Developers are now creating games with streaming in mind. Modern titles include built-in streaming tools, spectator-friendly interfaces, and features that encourage viewer interaction like chat integration and voting systems. Games are designed to be visually clear and exciting to watch, not just play. Some developers even create "streamer modes" that remove copyrighted music or add special features for streamers. The rise of streaming has birthed entirely new genres — party games, reaction-heavy horror titles, and social deduction games all thrive because they're inherently entertaining to watch.

    The Creator Economy Boom. Streaming has created entirely new career paths and revenue streams within gaming. Successful streamers earn through donations, subscriptions, brand partnerships, and revenue sharing from platform-specific features like Twitch bits or YouTube Super Chat. This has spawned a massive creator economy where top streamers command six-figure sponsorship deals, while publishers allocate significant budgets to influencer partnerships rather than traditional advertising. The rise of streaming has also fueled the growth of esports, where pro players double as entertainers – drawing massive online audiences and blurring the line between competition and content.

    Video Game Streaming in Numbers

    While it’s easy to feel the impact of streaming in daily gaming life, the numbers behind the trend tell an even more powerful story. From billions in revenue to global shifts in viewer behavior, game streaming has grown into a massive industry reshaping how we play, watch, and connect. Here’s a look at the data driving the movement.

    Market Size & Growth

    In 2025, the global Games Live Streaming market is projected to generate billion in revenue. By 2030, that figure is expected to reach billion, growing at an annual rate of 4.32%. The average revenue per user in 2025 stands at showing consistent monetization across platforms. China remains the single largest market, expected to bring in billion this year alone.
    #gaming #meets #streaming #inside #shift
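    Since the absolute dollar figures were stripped from the post above, here is a small back-of-the-envelope sketch of what the quoted 4.32% annual growth rate implies between 2025 and 2030, expressed only as a multiplier on the unstated 2025 revenue.

```python
# Compound growth implied by the quoted 4.32% annual rate. The absolute revenue
# figures are missing from the source, so only the relative multiplier is computed.
cagr = 0.0432
years = 2030 - 2025

multiplier = (1 + cagr) ** years
print(f"2030 revenue is roughly {multiplier:.3f} x the 2025 revenue")  # about 1.24x
```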
    Gaming Meets Streaming: Inside the Shift
    80.lv
    After a long, busy day, you boot up your gaming device but don’t quite feel like diving into an intense session. Instead, you open a broadcast of one of your favorite streamers and spend the evening laughing at commentary, reacting to unexpected moments, and just enjoying your time with fellow gamers. Sounds familiar?

    This everyday scenario perfectly captures the way live streaming platforms like Twitch, YouTube Gaming, or Kick have transformed the gaming experience — turning gameplay into shared moments where gamers broadcast in real-time while viewers watch, chat, learn, and discover new titles.

    What started as friends sharing gameplay clips has exploded into a multi-billion-dollar ecosystem where streamers are popular creators, viewers build communities around shared experiences, and watching games has become as popular as playing them. But how did streaming become such a powerful force in gaming – and what does it mean for players, creators, and the industry alike? Let’s find out!

    Why Do Gamers Love Streaming?

    So why are millions of gamers spending hours every week watching others play instead of jumping into a game themselves? The answer isn’t just one thing – it’s a mix of entertainment, learning, connection, and discovery that makes live streaming uniquely compelling. Let’s break it down.

    Entertainment at Your Own Pace

    Sometimes, you just want to relax. Maybe you’re too mentally drained to queue up for ranked matches or start that complex RPG quest. Streaming offers the perfect low-effort alternative – the fun of gaming without needing to press a single button. Whether it's high-stakes gameplay, hilarious commentary, or unpredictable in-game chaos, streams let you enjoy all the excitement while kicking back on the couch, grabbing a snack, or chatting in the background.

    Learning and Skill Development

    Streaming isn’t just for laughs – it’s also one of the best ways to level up your own gameplay. Watching a skilled streamer handle a tricky boss fight, execute high-level strategies, or master a game’s mechanics can teach you far more than a dry tutorial ever could. Many gamers tune in specifically to study routes, tactics, builds, or even to understand if a game suits their playstyle before buying it. Think of it as education, but way more fun.

    Social Connection and Community

    One of the most powerful draws of live streaming is the sense of community. Jumping into a stream isn’t like watching TV – it’s like entering a room full of people who love the same games you do. Chatting with fellow viewers, sharing reactions in real-time, tossing emotes into the chaos, and getting shoutouts from the streamer – it all creates a sense of belonging. For many, it’s a go-to social space where friendships, inside jokes, and even fandoms grow.

    Discovery of New Games and Trends

    Ever found a game you now love just because you saw a streamer play it? You’re not alone. Streaming has become a major discovery engine in gaming. Watching creators try new releases, revisit cult classics, or spotlight lesser-known indies helps players find titles they might never encounter on their own. Sometimes, entire genres or games blow up because of a few well-timed streams (Among Us, Vampire Survivors, Only Up! – all made big by streamers).

    Together, these draws have sparked a whole new kind of culture – gaming communities with their own languages, celebrities, and shared rituals.

    Inside Streaming Culture

    Streaming has created something unique in gaming: genuine relationships between creators and audiences who've never met.
    When Asmongold reacts to the latest releases or penguinz0 delivers his signature deadpan commentary, millions of viewers don't just watch – they feel like they're hanging out with a friend. These streamers have become trusted voices whose opinions carry real weight, making gaming fame more accessible than ever. Anyone with personality and dedication can build a loyal following and become a cultural influencer.

    If you've ever watched a Twitch stream, you've witnessed chat culture in action – a chaotic river of emotes, inside jokes, and reactions that somehow make perfect sense to regulars. "KEKW" expresses laughter, "Poggers" shows excitement, and memes spread like wildfire across communities. The chat itself becomes entertainment, with viewers competing to land the perfect reaction at just the right moment. These expressions often escape their stream origins, becoming part of the broader gaming vocabulary.

    For many viewers, streams have become part of their daily routine – tuning in at the same time, celebrating milestones, or witnessing historic gaming moments together. When a streamer finally beats that impossible boss, the entire community shares in the victory. These aren't just individual entertainment experiences — they're collective memories where thousands can say "I was there when it happened," creating communities that extend far beyond gaming itself.

    How Streamers Are Reshaping the Gaming Industry

    While players tune in for fun and connection, behind the scenes, streaming is quietly reshaping how the gaming industry approaches everything from marketing to game design. What started as casual gameplay broadcasts is now influencing major decisions across studios and publishers.

    The New Marketing Powerhouse. Traditional game reviews and advertising have taken a backseat to streamer influence. A single popular creator playing your game can generate millions of views and drive massive sales overnight – just look at how Among Us exploded after a few key streamers discovered it, or how Fall Guys became a phenomenon through streaming momentum. Publishers now prioritize getting their games into the hands of influential streamers on launch day, knowing that authentic gameplay footage and reactions carry more weight than any trailer or review. Day-one streaming success has become make-or-break for many titles.

    Designing for the Stream. Developers are now creating games with streaming in mind. Modern titles include built-in streaming tools, spectator-friendly interfaces, and features that encourage viewer interaction like chat integration and voting systems. Games are designed to be visually clear and exciting to watch, not just play. Some developers even create "streamer modes" that remove copyrighted music or add special features for streamers. The rise of streaming has birthed entirely new genres — party games, reaction-heavy horror titles, and social deduction games all thrive because they're inherently entertaining to watch.

    The Creator Economy Boom. Streaming has created entirely new career paths and revenue streams within gaming. Successful streamers earn through donations, subscriptions, brand partnerships, and revenue sharing from platform-specific features like Twitch bits or YouTube Super Chat. This has spawned a massive creator economy where top streamers command six-figure sponsorship deals, while publishers allocate significant budgets to influencer partnerships rather than traditional advertising.
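    The chat integration and voting systems mentioned under "Designing for the Stream" are simpler than they sound: Twitch exposes every channel's chat as IRC over a public WebSocket gateway, so a game, overlay, or companion tool can read messages and tally votes. The sketch below is a minimal illustration of that idea, not any shipped implementation; the channel name and the "!vote" command format are assumptions, and it relies on the third-party websockets package for Python.

# Minimal, illustrative chat poll: tally "!vote <option>" messages from a Twitch channel.
# Assumptions: the channel name and the "!vote" command are invented; requires `pip install websockets`.
import asyncio
import re
import websockets

TWITCH_IRC_WS = "wss://irc-ws.chat.twitch.tv:443"  # Twitch chat's IRC-over-WebSocket gateway
CHANNEL = "#examplechannel"                        # hypothetical channel to read from
VOTE_RE = re.compile(r"^!vote\s+(\w+)", re.IGNORECASE)

async def tally_votes(seconds: float = 30.0) -> dict[str, int]:
    votes: dict[str, int] = {}
    async with websockets.connect(TWITCH_IRC_WS) as ws:
        await ws.send("NICK justinfan12345")       # anonymous, read-only login
        await ws.send(f"JOIN {CHANNEL}")
        loop = asyncio.get_running_loop()
        deadline = loop.time() + seconds
        while loop.time() < deadline:
            try:
                raw = await asyncio.wait_for(ws.recv(), timeout=1.0)
            except asyncio.TimeoutError:
                continue
            if isinstance(raw, bytes):
                raw = raw.decode("utf-8", errors="replace")
            for line in raw.splitlines():
                if line.startswith("PING"):
                    await ws.send("PONG :tmi.twitch.tv")   # keep-alive expected by the server
                elif "PRIVMSG" in line:
                    text = line.split(":", 2)[-1].strip()  # the chat message body
                    match = VOTE_RE.match(text)
                    if match:
                        option = match.group(1).lower()
                        votes[option] = votes.get(option, 0) + 1
    return votes

if __name__ == "__main__":
    print(asyncio.run(tally_votes()))

    Real-world integrations more often go through Twitch's EventSub or Extensions APIs than raw IRC, but the loop above captures the basic idea of turning chat into game input.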
    The rise of streaming has also fueled the growth of esports, where pro players double as entertainers – drawing massive online audiences and blurring the line between competition and content.

    Video Game Streaming in Numbers

    While it’s easy to feel the impact of streaming in daily gaming life, the numbers behind the trend tell an even more powerful story. From billions in revenue to global shifts in viewer behavior, game streaming has grown into a massive industry reshaping how we play, watch, and connect. Here’s a look at the data driving the movement.

    Market Size & Growth

    In 2025, the global Games Live Streaming market is projected to generate $15.32 billion in revenue. By 2030, that figure is expected to reach $18.92 billion, growing at an annual rate of 4.32%.
    The average revenue per user (ARPU) in 2025 stands at $10.51, showing consistent monetization across platforms.
    China remains the single largest market, expected to bring in $2.92 billion this year alone.
    Source: Statista Market Insights, 2025

    Viewership & Daily Habits

    The number of users in the live game streaming market is forecast to hit 1.8 billion by 2030, with user penetration rising from 18.6% in 2025 to 22.6% by the end of the decade.
    In 2023, average daily time spent watching game streams rose to 2.5 hours per user, up 12% year-over-year — a clear sign of streaming becoming part of gamers’ daily routines.
    Sources: Statista Market Insights, 2025; SNS Insider, 2024

    What People Are Watching

    The most-watched games on Twitch include League of Legends, GTA V, and Counter-Strike — all regularly topping charts for both viewers and streamers.
    When it comes to creators, the most-streamed games are Fortnite, Valorant, and Call of Duty: Warzone, showing a strong overlap between what streamers love to broadcast and what audiences enjoy watching.
    In Q1 2024, Twitch users spent over 249 million hours watching new game releases, while total gaming-related content reached around 3.3 billion hours.
    Sources: SullyGnome, 2025; Statista, 2025

    Global Trends & Regional Platforms

    China’s local platforms like Huya (31M MAU) and Douyu (26.6M MAU) remain key players in the domestic market.
    In South Korea, following Twitch’s 2023 exit, local services like AfreecaTV and newcomer Chzzk have positioned themselves as alternatives.
    Meanwhile, Japan and Europe continue to see steady engagement driven by strong gaming scenes and dedicated fan communities.
    Source: Statista, 2025

    Event Livestreaming Hits New Highs

    Nintendo Direct was the most-watched gaming showcase in 2024, with an average minute audience of 2.6 million.
    The 2024 Streamer Awards drew over 645,000 peak viewers, highlighting how creator-focused events now rival traditional game showcases.
    Source: Statista, 2025

    As game streaming continues to evolve, its role in the broader gaming ecosystem is becoming clearer. It hasn’t replaced traditional gameplay – instead, it’s added a new dimension to how people engage with games, offering a space for connection, discovery, and commentary. For players, creators, and industry leaders alike, streaming now sits alongside playing as a core part of the modern gaming experience – one that continues to grow and shift with the industry itself.
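    Those market projections hang together arithmetically: $15.32 billion compounding at 4.32% a year for five years lands on roughly $18.9 billion, and dividing 2025 revenue by the $10.51 ARPU implies a user base of about 1.46 billion, on the path toward the 1.8 billion users forecast for 2030. A quick check, using only the figures cited above:

# Sanity-check the cited Statista figures (values taken from the article above).
revenue_2025 = 15.32   # global games live streaming revenue, USD billions, 2025
cagr = 0.0432          # projected annual growth rate
arpu_2025 = 10.51      # average revenue per user, USD, 2025

revenue_2030 = revenue_2025 * (1 + cagr) ** 5   # compounding 2025 -> 2030
implied_users = revenue_2025 / arpu_2025        # billions of users implied by revenue / ARPU

print(f"projected 2030 revenue: ${revenue_2030:.2f}B")  # ~18.93, vs. the cited $18.92B
print(f"implied 2025 users:     {implied_users:.2f}B")  # ~1.46B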
  • Creating a Detailed Helmet Inspired by Fallout Using Substance 3D

    80.lv
    Introduction

    Hi! My name is Pavel Vorobyev, and I'm a 19-year-old 3D Artist specializing in texturing and weapon creation for video games. I've been working in the industry for about 3 years now. During this time, I've had the opportunity to contribute to several exciting projects, including Arma, DayZ, Ratten Reich, and a NEXT-GEN sci-fi shooter (currently under NDA). Here's my ArtStation portfolio.

    My journey into 3D art began in my early teens, around the age of 13 or 14. At some point, I got tired of just playing games and started wondering: "How are they actually made?" That question led me to explore game development. I tried everything – level design, programming, game design – but it was 3D art that truly captured me.

    I'm entirely self-taught. I learned everything from YouTube, tutorials, articles, and official documentation, gathering knowledge piece by piece. Breaking into the commercial side of the industry wasn't easy: there were a lot of failures, no opportunities, and no support. At one point, I even took a job at a metallurgical plant. But I kept pushing forward, kept learning and improving my skills in 3D. Eventually, I got my first industry offer – and that's when my real path began.

    Today, I continue to grow, constantly experimenting with new styles, tools, and techniques. For me, 3D isn't just a profession – it's a form of self-expression and a path toward my dream. My goal is to build a strong career in the game industry and eventually move into cinematic storytelling in the spirit of Love, Death & Robots.

    Astartes YouTube channel

    I also want to inspire younger artists and show how powerful texturing can be as a creative tool. To demonstrate that, I'd love to share my personal project PU – Part 1, which reflects my passion and approach to texture art.

    In this article, I'll be sharing my latest personal project – a semi-realistic sci-fi helmet that I created from scratch, experimenting with both form and style. It's a personal exploration where I aimed to step away from traditional hyperrealism and bring in a touch of artistic expression.

    Concept & Project Idea

    The idea behind this helmet project came from a very specific goal – to design a visually appealing asset with rich texture variation and achieve a balance between stylization and realism. I wanted to create something that looked believable, yet had an artistic flair. Since I couldn't find any fitting concepts online, I started building the design from scratch in my head. I eventually settled on creating a helmet as the main focus of the project. For visual direction, I drew inspiration from post-apocalyptic themes and the gritty aesthetics of Fallout and Warhammer 40,000.

    Software & Tools Used

    For this project, I used Blender, ZBrush, Substance 3D Painter, Marmoset Toolbag 5, Photoshop, and RizomUV. I created the low-poly mesh in Blender and developed the concept and high-poly sculpt in ZBrush. In Substance 3D Painter, I worked on the texture concept and final texturing. Baking and rendering were done in Marmoset Toolbag, and I used Photoshop for some adjustments to the bake. UV unwrapping was handled in RizomUV.

    Modeling & Retopology

    I began the development process by designing the concept based on my earlier references – Fallout and Warhammer 40,000. The initial blockout was done in ZBrush, and from there, I started refining the shapes and details to create something visually engaging and stylistically bold.

    After completing the high-poly model, I moved on to the long and challenging process of retopology. Since I originally came from a weapons-focused background, I applied the knowledge I gained from modeling firearms. I slightly increased the polycount to achieve a cleaner and more appealing look in the final render – reducing visible faceting. My goal was to strike a balance between visual quality and a game-ready asset.

    UV Mapping & Baking

    Next, I moved on to UV mapping. There's nothing too complex about this stage, but since my goal was to create a game-ready asset, I made extensive use of overlaps. I did the UVs in RizomUV. The most important part is to align the UV shells into clean strips and unwrap cylinders properly into straight lines.

    Once the UVs were done, I proceeded to bake the normal and ambient occlusion maps. At this stage, the key is having clean UVs and solid retopology – if those are in place, the bake goes smoothly.

    Texturing: Concept & Workflow

    Now we move on to the most challenging stage – texturing. I aimed to present the project in a hyperrealistic style with a touch of stylization. This turned out to be quite difficult, and I went through many iterations. The most important part of this phase was developing a solid texture concept: rough decals, color combinations, and overall material direction. Without that foundation, it makes no sense to move forward with the texturing. After a long process of trial and error, I finally arrived at results I was satisfied with.

    Then I followed my pipeline:

    1. Working on the base materials
    2. Storytelling and damage
    3. Decals
    4. Spraying, dust, and dirt

    Working on the Base Materials

    When working on the base materials, the main thing is to work with the physical properties and texture. You need to extract the maximum quality from the generators before manual processing. The idea was to create the feeling of an old, heavy helmet that had lived its life and had previously been painted a different color. To make it battered and, in a sense, rotten.

    It is important to pay attention to noise maps – Dirt 3, Dirt 6, White Noise, Flakes – and add the feel of old metal with custom Normal Maps. I also mixed in photo textures for a special charm.

    Phototexture

    Custom Normal Map Texture

    Storytelling & Damage

    Gradients play an important role in the storytelling stage. They make the object artistically dynamic and beautiful, adding individual shades that bring the helmet to life.

    Everything else is done manually. I found a bunch of old World War II helmets and created damage alphas from them in Photoshop. I drew the damage with alphas, trying to clearly separate the material into old paint, new paint, rust, and bare metal.

    I did the rust using MatFX Rust from the standard Substance 3D Painter library. I drew beautiful patterns using paint in multiply mode – this quickly helped to recreate the rust effect. Metal damage and old paint were more difficult: due to the large number of overlaps in the helmet, I had to carefully draw patterns, minimizing the visibility of overlaps.

    Decals

    I drew the decals carefully, sticking to the concept, which added richness to the texture.

    Spray Paint & Dirt

    For spray paint and dirt, I used a long-established weapon template consisting of dust particles, sand particles, and spray paint. I analyzed references and applied them to crevices and logical places where dirt could accumulate.

    Rendering & Post-Processing

    I rendered in Marmoset Toolbag 5 using a new rendering approach that I developed together with the team. The essence of the method is to simulate "RAW frames." Since Marmoset does not have such functions built in, I worked with the 32-bit EXR format, which significantly improves the quality of the render: the shadows are smooth, without artifacts or broken gradients. I assembled the scene using Quixel Megascans. After rendering, I did post-processing in Photoshop using the Camera Raw filter.

    Conclusion & Advice for Beginners

    That's all. For beginners or those who have been unsuccessful in the industry for a long time, I advise you to follow your dream and not listen to anyone else. Success is a matter of time and skill! Talent is not something you are born with; it is something you develop. Work on yourself and your work, put your heart into it, and you will succeed!

    Pavel Vorobiev, Texture Artist

    Interview conducted by Gloria Levine
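    Pavel's point about rendering to 32-bit EXR is easy to see with a toy example: in an 8-bit image, a dark, slow gradient collapses into a handful of discrete steps, which shows up as banding in shadows, while 32-bit floats keep every pixel distinct and leave room to grade the "RAW" frame afterwards. The NumPy sketch below is my own simplified illustration (it ignores gamma encoding and dithering), not part of his pipeline:

# Toy illustration: shadow banding in 8-bit output vs. a 32-bit float render target.
import numpy as np

# A dark, linear gradient such as a soft shadow falloff across a 1920-pixel-wide frame.
gradient_f32 = np.linspace(0.0, 0.05, 1920, dtype=np.float32)

# The same gradient quantized to 8-bit, as a PNG/JPEG-style output would store it.
gradient_u8 = np.round(gradient_f32 * 255).astype(np.uint8)

print("distinct levels, 32-bit float:", np.unique(gradient_f32).size)  # 1920 (every pixel unique)
print("distinct levels, 8-bit:       ", np.unique(gradient_u8).size)   # 14 (visible banding steps)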
  • Do you know Ghardaïa and its wonderful landmarks? This city is famous for its camels and its collective weddings! In the new video on the Echorouk News channel, we dive together into the customs of the people of the Theniya neighborhood and see how they celebrate special occasions in a distinctive way that conveys the spirit of hospitality and long-standing traditions.

    Personally, I saw how the collective wedding creates an atmosphere of joy and happiness, with all ages coming together and celebrating as if they were one family. Who among you has taken part in a wedding like this? And what was the experience like?

    Don't forget to watch the video; it will open your eyes to the customs and traditions of Ghardaïa, which have stood firm despite the passage of time.

    https://www.youtube.com/watch?v=ecSTCG237nI
    #غرداية #العرس_الجماعي #Traditions #EchoroukNews #Célébration
  • Inside The Blood of Dawnwalker’s narrative sandbox and dual gameplay

    I’m Mateusz Tomaszkiewicz, Creative Director at Rebel Wolves, working on The Blood of Dawnwalker — a story-driven, open-world RPG set in 14th-century Europe. We recently showcased a closer look at the gameplay at gamescom and wanted to share some of the exciting new details with you.

    Entering the narrative sandbox

    You play as Coen, the game’s protagonist and the titular Dawnwalker. While the exact circumstances remain a mystery, an attempt to turn Coen into a vampire fails. As a result, he exists between two worlds — human during the day, vampire at night. This duality sits at the heart of the gameplay, with each form offering distinct skills and abilities. Most quests can be approached during either day or night, creating significantly different experiences.

    One of the key mechanics is what we call the “narrative sandbox.” Once the prologue concludes, you have 30 days and nights to rescue Coen’s family from Brencis, a centuries-old vampire and former Roman senator, and his inner circle. Only major actions move the clock forward, and you’re always informed how much time an activity will consume. Roaming the open world does not advance time, giving you the freedom to explore without pressure. Time works more like a currency than a countdown.
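    To make the "time as a currency" idea concrete, here is a small illustrative sketch of how such a clock could be modeled. The activity costs are invented for the example; this is not Rebel Wolves' actual implementation.

# Illustrative sketch of a "time as currency" quest clock; numbers and activities are invented.
from dataclasses import dataclass

@dataclass
class QuestClock:
    days_left: float = 30.0                       # the 30 days and nights after the prologue

    def preview(self, cost: float) -> str:
        # The player is always told up front how much time an action will consume.
        return f"This will take {cost:g} day(s); {self.days_left - cost:g} would remain."

    def spend(self, cost: float) -> None:
        # Only major actions move the clock forward; roaming the open world costs nothing.
        self.days_left -= cost

clock = QuestClock()
print(clock.preview(0.5))   # e.g. committing to a major quest step
clock.spend(0.5)
clock.spend(0.0)            # free exploration does not advance time
print(f"{clock.days_left:g} days and nights remaining")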

    What truly defines the narrative sandbox is the freedom it gives you to shape the story on your own terms. Quests can be completed in any order, skipped entirely, or even never discovered at all depending on your choices. Many characters can be killed, with their absence reshaping events and relationships. There are often multiple paths to achieve the same goal, and even inaction is a choice — one the world around Coen will recognize and respond to. The result is a deeply reactive narrative structure that encourages experimentation and makes every playthrough unique.

    Human by day. Vampire by night.

    The newly revealed quest takes place on day eight of Coen’s journey. By then, several quests have been completed and Coen is visibly more powerful. We first see him at night, navigating the capital city of Svartrau and using vampiric abilities like Shadowstep — a short-range teleport that lets him instantly reposition. It’s invaluable not only in combat, allowing him to flank enemies or close the gap on patrolling guards, but also for exploration and stealth.

    With Shadowstep, Coen can reach scaffolding high on the cathedral’s walls, leap between rooftops, or slip into otherwise inaccessible balconies and ledges, opening up new routes and opportunities to approach objectives. Combat remains fluid and dynamic, blending physical strength with supernatural powers. His vampire form isn’t overpowered, but it adds a distinct tactical layer.

    The quest involves infiltrating the city’s cathedral, where Coen encounters Xanthe — an ancient Greek vampire and Brencis’ most powerful ally.

    To show how time of day affects gameplay, we then reload a save to experience a daytime version of the quest. This time, the goal is to locate the legendary sword of Saint Mihai, the cathedral’s patron. In his human form, Coen leans more on swordplay and dark human magic, unavailable as a vampire. Combat is fast and responsive, with directional attacks and flexible blocking. You can block easily with a single button or use directional input for more precision and control, accommodating both story-focused players and those seeking a challenge.

    Fighting the living, the dead, and everything in between

    Once the enemies are defeated, Coen enters Svartrau during the day. The streets are bustling with life — townsfolk fill the squares, merchants trade goods, and ambient conversations hint at the uneasy coexistence under vampire rule. After roaming the vibrant streets, Coen goes to the cathedral.

    Inside, we witness a chilling ritual called the Blood Baptism, one of Brencis’ ways of twisting existing traditions to maintain control.

    After a tense dialogue sequence that nearly exposes Coen, the sword quest resumes. This leads to a battle with Muron, a creature born from a failed vampire transformation. Unlike Coen, Muron does not become a Dawnwalker but a wild, unstable monster with unpredictable powers.

    Another hex in Coen’s arsenal – Compel Soul – allows him to speak with the dead, helping him uncover clues and eventually locate a hidden crypt believed to hold the sword. What he finds is far more disturbing: Saint Mihai, once revered, had been entombed alive after villagers discovered he was also a Dawnwalker. Starved during the day and regenerating at night, Mihai slowly lost his sanity. When freed, he attacks Coen using the full range of Dawnwalker abilities.

    If you find him at night instead, Mihai appears in his vampire form, offering an entirely different encounter.

    The nearly 50-minute demo offers an extended look at The Blood of Dawnwalker’s design — from its dual gameplay loop and time-based structure to its focus on player agency and narrative depth. There is no single path through this story, and every decision, including inaction, shapes the journey. Slated for release in 2026, The Blood of Dawnwalker aims to deliver rich storytelling, immersive combat, and an open world where narrative truly takes the lead.

    The Blood of Dawnwalker is coming to PlayStation 5 in 2026; wishlist it now to stay updated and be among the first to step into Coen’s journey!
    blog.playstation.com