• Hey everyone, I just remembered the match I watched between Liverpool and Southampton! We were at the café, everyone talking about their favourite team, and the passion was at its peak. Today we're talking about the result of the third-round draw of the English League Cup, where Liverpool will face Southampton. Exciting topic, right?

    The article covers the full details of the match, the preparations, and the fans' expectations. Personally, I have wonderful memories of Manchester United and Liverpool; I've always been hugely excited about these matches.

    Don't forget, this match will be a big opportunity to win or to lose, and everyone has an opinion. What matters is the passion that brings us together as football fans.

    https://news.google.com/atom/articles/CBMixwRBVV95cUxPV0lsWFVmWlZ3SEotZF8tYnFBbWUwWTRRUzEyaUxDcm5NMFdNSnV0TmFLTWhYak
  • Every day now we hear about the big companies, especially Intel, which is floundering in serious trouble. But what would you say if we told you that the US government is not the solution it needs?

    The article argues that Intel doesn't need cash; it needs to figure out how to bring customers to its foundry business. In other words, money isn't everything: you need a clear strategy and real interest from the market.

    Personally, I've followed companies that succeeded thanks to innovation and good marketing, and I know this point matters a lot. If you don't have a strong offering, even money won't help you!

    Think carefully about how the big companies go about developing themselves.

    https://techcrunch.com/2025/08/26/why-the-u-s-government-is-not-the-savior-intel-needs/

    #Intel #Technology #Foundry #Innovation #Politics
  • Fur Grooming Techniques For Realistic Stitch In Blender

Introduction

Hi everyone! My name is Oleh Yakushev, and I'm a 3D Artist from Ukraine. My journey into 3D began just three years ago, when I was working as a mobile phone salesperson at a shopping mall. In 2022, during one slow day at work, I noticed a colleague learning Python. We started talking about life goals. I told him I wanted to switch careers, to do something creative, but programming wasn't really my thing.

He asked me a simple question: "Well, what do you actually enjoy doing?"

I said, "Video games. I love video games. But I don't have time to learn how to make them, I've got a job, a family, and a kid."

Then he hit me with something that really shifted my whole perspective. "Oleh, do you play games on your PlayStation?"

I said, "Of course."

He replied, "Then why not take the time you spend playing and use it to learn how to make games?"

That moment flipped a switch in my mind. I realized that I did have time, it was just a matter of how I used it. If I really wanted to learn, I could find a way. At the time, I didn't even own a computer. But where there's a will, there's a way: I borrowed my sister's laptop for a month and started following beginner 3D tutorials on YouTube. Every night after work, once my family went to sleep, I'd sit in the kitchen and study. I stayed up until 2 or 3 AM, learning Blender basics. Then I'd sleep for a few hours before waking up at 6 AM to go back to work. That's how I spent my first few months in 3D, studying every single night.

3D completely took over my life. During lunch breaks, I watched 3D videos; on the bus, I scrolled through 3D TikToks; at home, I took 3D courses. The word "3D" just became a constant in my vocabulary.

After a few months of learning the basics, I started building my portfolio, which looks pretty funny to me now. But at the time, it was a real sign of how committed I was. Eventually, someone reached out to me through Behance, offering my first freelance opportunity.
And that's how my journey began, from mall clerk to 3D artist. It's been a tough road, full of burnout, doubts, and late nights... but also full of curiosity, growth, and hope. And I wouldn't trade it for anything.

The Stitch Project

I've loved Stitch since I was a kid. I used to watch the cartoons, play the video games, and he always felt like such a warm, funny, chill, and at the same time, strong character. So once I reached a certain level in 3D, I decided to recreate Stitch.

Back then, my skills only allowed me to make him in a stylized cartoonish style: no fur, no complex detailing, no advanced texturing. I just didn't have the experience. Surprisingly, the result turned out pretty decent. Even now, I sometimes get comments that my old Stitch still looks quite cute. Though honestly, I wouldn't say that myself anymore. Two years have passed since I made that first Stitch, back in 2023. And in 2025, I decided it was time to challenge myself.

At that point, I had just completed an intense grooming course. Grooming always intimidated me; it felt really complex. I avoided it on commercial projects, made a few failed attempts for my portfolio, and overall tried to steer clear of any tasks where grooming was required. But eventually, I found the strength to face it.

I pushed myself to learn how to make great fur, and I did. I finally understood how the grooming system works, grasped the logic, the tools, and the workflow. And after finishing the course, I wanted to lock in all that knowledge by creating a full personal project from scratch.

So my goal was to make a character from the ground up, where the final stage would be grooming. And without thinking too long, I chose Stitch. First, because I truly love the character. Second, I wanted to clearly see my own progress over the past two years.
Third, I needed to put my new skills to the test and find out whether my training had really paid off.

Modeling

I had a few ideas for how to approach the base mesh for this project. First, to model everything completely from scratch, starting with a sphere. Second, to reuse my old Stitch model and upgrade it.

But then an idea struck me: why not test how well AI could handle a base mesh? I gathered some references and tried generating a base mesh using AI, uploading Stitch visuals as a guide. As you can see from the screenshot, the result was far from usable. So I basically ended up doing everything from scratch anyway.

So, I went back to basics: digging through ArtStation and Pinterest, collecting references. Over the last two years, I had not only learned grooming but also completely changed my overall approach to character creation, so it was important for me to make a more detailed model, even if much of it would be hidden under fur.

The first Stitch was sculpted in Blender, with all the limitations that come with sculpting in it. But since then, I've leveled up significantly and switched to more advanced tools. So this second version of Stitch was born in ZBrush. By the time I started working on this Stitch, ZBrush had already become my second main workspace. I've used it to deliver tons of commercial projects, I work in it almost daily, and most of my portfolio was created using this tool.

I found some great reference images showing Stitch's body structure. Among them were official movie references and a stunning high-poly model created by Juan Hernández, a version of Stitch without fur. That model became my primary reference for sculpting. Truth is, Stitch's base form is quite simple, so blocking out the shape didn't take too long.
When blocking, I use Blender in combination with ZBrush:

- I work with primary forms in ZBrush
- Then I check proportions in Blender
- I fix mistakes, tweak volumes, and refine the silhouette

Since Stitch's shape isn't overly complex, I broke him down into a few main sculpting parts:

- The body: arms, legs, head, and ears
- The nose, eyes, and mouth cavity

While planning the sculpt, I already knew I'd be rigging Stitch, both body and facial rig. So I started sculpting with his mouth open.

While studying various references, I noticed something interesting. Stitch from promotional posters, Stitch from the movie, and Stitch as recreated by different artists on ArtStation all look very different from one another. What surprised me the most was how different the promo version of Stitch is compared to the one in the actual movie. They are essentially two separate models:

- Different proportions
- Different shapes
- Different textures
- Even different fur and overall design

This presented a creative challenge: I had to develop my own take on Stitch's design. Sometimes I liked the way the teeth were done in one version; in another, the eye placement; in another, the fur shape or the claw design on hands and feet.

At first, considering that Stitch is completely covered in fur from head to toe, sculpting his underlying anatomy seemed pointless. I kept asking myself: "Why sculpt muscles and skin detail if everything will be hidden under fur anyway?" But eventually, I found a few solid answers for myself. First, having a defined muscle structure actually makes the fur grooming process easier. That's because fur often follows the flow of muscle lines, so having those muscles helps guide fur direction more accurately across the character's body. Second, it's great anatomy practice, and practice is never a waste.
So, I found a solid anatomical reference of Stitch with clearly visible muscle groups and tried to recreate that structure as closely as possible in my own sculpt. In the end, I had to develop a full visual concept by combining elements from multiple versions of Stitch. Through careful reference work and constantly switching between Blender and ZBrush, I gradually, but intentionally, built up the body and overall look of our favorite fluffy alien.

Topology & UVs

Throughout the sculpting process, I spent quite a bit of time thinking about topology. I was looking for the most balanced solution between quality and production time. Normally, I do manual retopology for my characters, but this time, I knew it would take too much time, and honestly, I didn't have that luxury.

So I decided to generate the topology using ZBrush's tools. I split the model into separate parts using Polygroups, assigning individual groups for the ears, the head, the torso, the arms, the legs, and each of Stitch's fingers. With the Polygroups in place, I used ZRemesher with Keep Groups enabled and smoothing on group borders. This gave me a clean and optimized mesh that was perfect for UV unwrapping.

Of course, this kind of auto-retopology isn't a full substitute for manual work, but it saved me a huge amount of time, and the quality was still high enough for what I needed. However, there was one tricky issue. Although Stitch looks symmetrical at first glance, his ears are actually asymmetrical: the right ear has a scar on the top, while the left has a scar on the bottom. Because of that, I couldn't just mirror one side in ZBrush without losing those unique features.

Here's what I ended up doing: I created a symmetrical model with the right ear, then another symmetrical model with the left ear. I brought both into Blender, detached the left ear from one model, and attached it to the body of the other one.
This way, I got a clean, symmetrical base mesh with asymmetrical ears, preserving both topology and detail. And thanks to the clean polygroup-based layout, I was able to unwrap the UVs with nice, even seams and clean islands.

When it came to UV mapping, I divided Stitch into two UDIM tiles:

- The first UDIM includes the head with ears, torso, arms, and legs.
- The second UDIM contains all the additional parts: teeth, tongue, gums, claws, and nose.

Since the nose is one of the most important details, I allocated the largest space to it, which helped me better capture its intricate details. As for the eyes, I used procedural eyes, so there was no need to assign UV space or create a separate UDIM for texturing them. To achieve this, I used the Tiny Eye add-on by tinynocky for Blender, which allows full control over procedural eyes and their parameters. This approach gave me high-quality eyes with customizable elements tailored exactly to my needs.

As a result of all these steps, Stitch ended up with a symmetrical, optimized mesh, asymmetrical ears, and the body split across two UDIMs: one for the main body and one for the additional parts.

Texturing

When planning Stitch's texturing, I understood that the main body texture would be fairly simple, with much of the visual detail enhanced by the fur. However, there were some areas that required much more attention than the rest of the body. The textures for Stitch can be roughly divided into several main parts:

- The base body, which includes the primary color of his fur, along with additional shading like a lighter tone on the front and a darker tone on the back and nape.
- The nose and ears: these zones demanded separate focus.

At the initial texturing/blocking stage, the ears looked too cartoony, which didn't fit the style I wanted. So, I decided to push them towards a more realistic look.
This involved removing bright colors, adding more variation in the roughness map, introducing variation in the base color, and making the ears visually more natural, layered, and textured on the surface. By combining smart materials and masks, I achieved the effect of "living" ears, slightly dirty and looking as natural as possible.

The nose was a separate story. It occupies a significant part of the face and thus draws a lot of attention. While studying references, I noticed that the shape and texture of the nose vary a lot between different artists. Initially, I made it dog-like, with some wear and tear around the nostrils and base. For a long time, I thought this version was acceptable. But during test renders, I realized the nose needed improvement. So I reworked its texturing, aiming to make it more detailed. I divided the nose texture into four main layers:

- Base detail: baked from the high-poly model. Over this, I applied a smart skin material that added characteristic bumps.
- Lighter layer: applied via a mask using the AO channel. This darkened the crevices and brightened the bumps, creating a multi-layered effect.
- Organic detail: in animal references, I noticed slight redness in the nose area. I created another AO-masked layer with reddish capillaries visible through the bumps, adding depth and realism.
- Softness: to make the nose visually softer, like in the references, I added a fill layer with only height enabled, used a paper texture as grayscale, and applied a blurred mask. This created subtle dents and wrinkles that softened the look.

All textures were created in 4K resolution to achieve maximum detail. After finishing the main texturing stage, I added an Ambient Occlusion map on the final texture layer, activating only the Color channel, setting the blend mode to Multiply, and reducing the opacity to about 35%. This adds volume and greatly improves the overall perception of the model. That covers the texturing of Stitch's body.
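The effect of that final AO layer, a Multiply blend with the opacity pulled back to roughly 35%, can be sketched as a tiny standalone function. This is only a minimal re-creation of the layer math under my own naming, not Substance 3D Painter's actual implementation:

```python
def multiply_blend(base: float, ao: float, opacity: float = 0.35) -> float:
    """Blend an ambient-occlusion value over a base color channel using a
    Multiply layer at reduced opacity. All values are assumed in [0, 1]."""
    blended = base * ao                                # full-strength Multiply
    return base * (1.0 - opacity) + blended * opacity  # fade the layer back in


# Occluded crevices darken moderately instead of going black, because the
# layer only contributes 35% of the full Multiply result.
softened = multiply_blend(0.5, 0.2)   # ≈ 0.36; full-strength would give 0.10
```

At full opacity this collapses to a plain multiply, which is exactly why the reduced opacity reads as added volume rather than crushed shadows.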
I also created a separate texture for the fur. This was simpler: I disabled unnecessary layers like the ears and eyelids, and left only the base ones corresponding to the body's color tones. During grooming, I also created textures for the fur's clumps and roughness. In Substance 3D Painter, I additionally painted masks for better fur detail.

Fur

And finally, I moved on to the part that was most important to me, the very reason I started this project in the first place: fur. This entire process was essentially a test of my fur grooming skills. After overcoming self-doubt, I trusted the process and relied on everything I had learned so far. Before diving into the grooming itself, I made sure to gather strong references. I searched for the highest-quality and most inspiring examples I could find and analyzed them thoroughly. My goal was to clearly understand the direction of fur growth, its density and volume, the intensity of roughness, and the strength of clumping in different areas of Stitch's body.

To create the fur, I used Blender and its Hair Particle System. The overall approach is similar to sculpting a high-detail model: work from broad strokes to finer details. So, the first step was blocking out the main flow and placement of the hair strands.

At this point, I ran into a challenge: symmetry. Since the model was purposefully asymmetrical, the fur couldn't be mirrored cleanly. To solve this, I created a base fur blocking using Hair Guides with just two segments. After that, I split the fur into separate parts: I duplicated the main Particle System and created individual hair systems for each area where needed. In total, I broke Stitch's body into key sections: head, left ear, right ear, front torso, back torso, arms, hands, upper and lower legs, toes, and additional detailing layers. The final fur setup included 25 separate particle systems.

To control fur growth, I used Weight Paint to fine-tune the influence on each body part individually.
This separation gave me much more precision and allowed full control over every parameter of the fur on a per-section basis.

The most challenging aspect of working with fur is staying patient and focused. Detail is absolutely critical, because the overall picture is built entirely from tiny, subtle elements. Once the base layer was complete, I moved on to refining the fur based on my references. The most complex areas turned out to be the front of the torso and the face. When working on the torso, my goal was to create a smooth gradient, from thick, clumped fur on the chest to shorter, softer fur on the stomach. Step by step, I adjusted the transitions, directions, clumps, and volumes to achieve that look. Additionally, I used the fur itself to subtly enhance Stitch's silhouette, making his overall shape feel sharper, more expressive, and visually engaging.

During fur development, I used texture maps to control the intensity of the Roughness and Clump parameters. This gave me a high degree of flexibility: textures drove these attributes across the entire model. In areas where stronger clumping or roughness was needed, I used brighter values; in zones requiring a softer look, darker values. This approach allowed for fine-tuned micro-level control of the fur shader and helped achieve a highly realistic appearance in renders.

The face required special attention: the fur had to be neat, evenly distributed, and still visually appealing. The biggest challenge here was working around the eye area. Even with properly adjusted Weight Paint, interpolation sometimes caused strands to creep into the eyes. I spent a lot of time cleaning up this region to get an optimal result. I also had to revisit certain patches that looked bald, even though interpolation and weight painting were set correctly, because the fur didn't render properly there.
These areas needed manual fixing.

As part of the detailing stage, I also increased the number of segments in the Hair Guides. While the blocking phase only used two segments, I went up to three, and in some cases even five, for more complex regions. This gave me much more control over fur shape and flow. The tiniest details really matter, so I added extra fur layers with thinner, more chaotic strands extending slightly beyond the main silhouette. These micro-layers significantly improved the texture depth and boosted the overall realism.

Aside from the grooming itself, I paid special attention to the fur material setup, as the shader plays a critical role in the final visual quality of the render. It's not enough to simply plug a color texture into a Principled BSDF node and call it done. I built a more complex shader, giving me precise control over various attributes. For example, I implemented subtle color variation across individual strands, along with darkening near the roots and a gradual brightening toward the tips. This helped add visual depth and made the fur look significantly more natural and lifelike.

Working on the fur took up nearly half of the total time I spent on the entire model. And I'm genuinely happy with the result: this stage confirmed that the training I've gone through was solid and that I'm heading in the right direction with my artistic development.

Rigging, Posing & Scene

Once I finished working on the fur, I rendered several 4K test shots from different angles to make sure every detail looked the way I intended. When I was fully satisfied with the results, it was time to move on to rigging. I divided the rigging process into three main parts:

- Body rig, for posing and positioning the character
- Facial rig, for expressions and emotions
- Ear rig, for dynamic ear control

Rigging isn't something I consider my strongest skill, but as a 3D generalist, I had to dive into many technical aspects of it.
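The fur-shader behavior described earlier, darker roots brightening toward the tips with a small per-strand color variation, can be approximated outside Blender as a short function. In Blender this would be wired from the Hair Info node's Intercept and Random outputs into a color ramp; the sketch below uses made-up parameter names and defaults purely for illustration:

```python
import random


def strand_color(base_rgb, intercept, strand_seed,
                 root_dark=0.6, tip_bright=1.15, variation=0.08):
    """Shade one fur strand. `intercept` runs from 0 at the root to 1 at
    the tip (like Hair Info > Intercept). The strand is darkened near the
    root, brightened toward the tip, and given a small deterministic
    per-strand brightness jitter (like Hair Info > Random)."""
    jitter = 1.0 + random.Random(strand_seed).uniform(-variation, variation)
    gradient = root_dark + (tip_bright - root_dark) * intercept
    return tuple(min(1.0, c * gradient * jitter) for c in base_rgb)


blue = (0.25, 0.35, 0.85)                 # hypothetical Stitch-like base color
root = strand_color(blue, 0.0, strand_seed=7)
tip = strand_color(blue, 1.0, strand_seed=7)
# Same strand, same jitter: the tip comes out brighter than the root
# on every channel, which is what gives the fur its visual depth.
```

Because the jitter is seeded per strand, neighboring strands differ slightly in brightness while each strand stays consistent from root to tip.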
For the ears, I set up a relatively simple system with several bones connected using inverse kinematics. This gave me flexible and intuitive control during posing and allowed for the addition of dynamic movement in animation. For facial rigging, I used the FaceIt add-on, which generates a complete facial control system for the mouth, eyes, and tongue. It sped up the process significantly and gave me more precision. For the body, I used the ActorCore Rig from Reallusion, then converted it to Rigify, which gave me a familiar interface and flexible control over poses.

Posing is one of my favorite stages: it's when the character really comes to life. As usual, it started with gathering references. Honestly, it was hard to pick the final poses; Stitch is so expressive and full of personality that I wanted to try hundreds of them. But I focused on those that best conveyed the spirit and mood of the character. Some poses I reworked to fit my style rather than copying directly. For example, in the pose where Stitch licks his nose, I added drool and a bit of "green slime" for comedic effect. To capture motion, I tilted his head back and made the ears fly upward, creating a vivid, emotional snapshot.

Just like in sculpting or grooming, minor details make a big difference in posing. Examples include a slight asymmetry in the facial expression, a raised corner of the mouth, one eye squinting a little more than the other, and ears set at slightly different angles. These are subtle things that might not be noticed immediately, but they're the key to making the character feel alive and believable.

For each pose, I created a separate scene and collection in Blender, including the character, a specific lighting setup, and a simple background or environment.
This made it easy to return to any scene later to adjust lighting, reposition the character, or tweak the background.

In one of the renders, which I used as the cover image, Stitch is holding a little frog. I want to clearly note that the 3D model of the frog is not mine; full credit goes to the original author of the asset.

At first, I wanted to build a full environment around Stitch, to create a scene that would feel like a frame from a film. But after carefully evaluating my skills and priorities, I decided that a weak environment would only detract from the strength of the character. So I opted for a simple, neutral backdrop, designed to keep all the focus on Stitch himself.

Rendering, Lighting & Post-Processing

When the character is complete, posed expressively, and integrated into the scene, there's one final step: lighting. Lighting isn't just a technical element of the scene; it's a full-fledged stage of the 3D pipeline. It doesn't just illuminate, it paints. Proper lighting can highlight the personality of the character, emphasize forms, and create atmosphere.

For all my renders, I rely on the classic three-point lighting setup: Key Light, Fill Light, and Rim Light. While this setup is well known, it remains highly effective. When done thoughtfully, with the right intensity, direction, and color temperature, it creates a strong light-shadow composition that brings the model to life. In addition to the three main lights, I also use an HDRI map, but with very low intensity, around 0.3, just enough to subtly enrich the ambient light without overpowering the scene.

Once everything is set, it's time to hit Render and wait for the result. Due to hardware limitations, I wasn't able to produce full animated shots with fur. Rendering a single 4K image with fur took over an hour, so I limited myself to a 360° turnaround and several static renders.

I don't spend too much time on post-processing, just basic refinements in Photoshop.
Slight enhancement of the composition, gentle shadow adjustments, color balance tweaks, and adding a logo. Everything is done subtly, nothing overprocessed. The goal is simply to support and enhance what's already there.

Final Thoughts

This project has been an incredible experience. Although it was my second time creating Stitch, this time the process felt completely different at every stage. And honestly, it wasn't easy. But that was exactly the point: to challenge myself. To reimagine something familiar, to try things I'd never done before, and to walk the full journey from start to finish.

The fur, the heart of this project, was especially meaningful to me. It's what started it all. I poured a lot into this model: time, effort, emotion, and even doubts. But at the same time, I brought all my knowledge, skills, and experience into it. This work became a mirror of my progress from 2023 to 2025. I can clearly see how far I've come, and that gives me the motivation to keep going. Every hour of learning and practice paid off; the results speak for themselves.

This model was created for my portfolio. I don't plan to use it commercially, unless, of course, a studio actually wants to license it for a new film. It's been a long road: challenging, sometimes exhausting, but above all inspiring and exciting. I know there's still a lot to learn. Many things to study, improve, and polish to perfection. But I'm already on that path, and I'm not stopping.

Oleh Yakushev, 3D Character Artist

Interview conducted by Gloria Levine
    #fur #grooming #techniques #realistic #stitch
    Fur Grooming Techniques For Realistic Stitch In Blender
    IntroductionHi everyone! My name is Oleh Yakushev, and I'm a 3D Artist from Ukraine. My journey into 3D began just three years ago, when I was working as a mobile phone salesperson at a shopping mall. In 2022, during one slow day at work, I noticed a colleague learning Python. We started talking about life goals. I told him I wanted to switch careers, to do something creative, but programming wasn't really my thing.He asked me a simple question: "Well, what do you actually enjoy doing?"I said, "Video games. I love video games. But I don't have time to learn how to make them, I've got a job, a family, and a kid."Then he hit me with something that really shifted my whole perspective."Oleh, do you play games on your PlayStation?"I said, "Of course."He replied, "Then why not take the time you spend playing and use it to learn how to make games?"That moment flipped a switch in my mind. I realized that I did have time, it was just a matter of how I used it. If I really wanted to learn, I could find a way. At the time, I didn't even own a computer. But where there's a will, there's a way: I borrowed my sister's laptop for a month and started following beginner 3D tutorials on YouTube. Every night after work, once my family went to sleep, I'd sit in the kitchen and study. I stayed up until 2 or 3 AM, learning Blender basics. Then I'd sleep for a few hours before waking up at 6 AM to go back to work. That's how I spent my first few months in 3D, studying every single night.3D completely took over my life. During lunch breaks, I watched 3D videos, on the bus, I scrolled through 3D TikToks, at home, I took 3D courses, and the word "3D" just became a constant in my vocabulary.After a few months of learning the basics, I started building my portfolio, which looks pretty funny to me now. But at the time, it was a real sign of how committed I was. Eventually, someone reached out to me through Behance, offering my first freelance opportunity. 
And thatэs how my journey began, from mall clerk to 3D artist. It's been a tough road, full of burnout, doubts, and late nights... but also full of curiosity, growth, and hope. And I wouldn't trade it for anything.The Stitch ProjectI've loved Stitch since I was a kid. I used to watch the cartoons, play the video games, and he always felt like such a warm, funny, chill, and at the same time, strong character. So once I reached a certain level in 3D, I decided to recreate Stitch.Back then, my skills only allowed me to make him in a stylized cartoonish style, no fur, no complex detailing, no advanced texturing, I just didn't have the experience. Surprisingly, the result turned out pretty decent. Even now, I sometimes get comments that my old Stitch still looks quite cute. Though honestly, I wouldn't say that myself anymore. Two years have passed since I made that first Stitch, it was back in 2023. And in 2025, I decided it was time to challenge myself.At that point, I had just completed an intense grooming course. Grooming always intimidated me, it felt really complex. I avoided it on commercial projects, made a few failed attempts for my portfolio, and overall tried to steer clear of any tasks where grooming was required. But eventually, I found the strength to face it.I pushed myself to learn how to make great fur, and I did. I finally understood how the grooming system works, grasped the logic, the tools, and the workflow. And after finishing the course, I wanted to lock in all that knowledge by creating a full personal project from scratch.So my goal was to make a character from the ground up, where the final stage would be grooming. And without thinking too long, I chose Stitch.First, because I truly love the character. Second, I wanted to clearly see my own progress over the past two years. 
Third, I needed to put my new skills to the test and find out whether my training had really paid off.ModelingI had a few ideas for how to approach the base mesh for this project. First, to model everything completely from scratch, starting with a sphere. Second, to reuse my old Stitch model and upgrade it.But then an idea struck me: why not test how well AI could handle a base mesh? I gathered some references and tried generating a base mesh using AI, uploading Stitch visuals as a guide. As you can see from the screenshot, the result was far from usable. So I basically ended up doing everything from scratch anyway.So, I went back to basics: digging through ArtStation and Pinterest, collecting references. Since over the last two years, I had not only learned grooming but also completely changed my overall approach to character creation, it was important for me to make a more detailed model, even if much of it would be hidden under fur.The first Stitch was sculpted in Blender, with all the limitations that come with sculpting in it. But since then, I've leveled up significantly and switched to more advanced tools. So this second version of Stitch was born in ZBrush. By the time I started working on this Stitch, ZBrush had already become my second main workspace. I've used it to deliver tons of commercial projects, I work in it almost daily, and most of my portfolio was created using this tool. I found some great reference images showing Stitch's body structure. Among them were official movie references and a stunning high-poly model created by Juan Hernández, a version of Stitch without fur. That model became my primary reference for sculpting.Truth is, Stitch's base form is quite simple, so blocking out the shape didn't take too long. 
When blocking, I use Blender in combination with ZBrush:

- I work with primary forms in ZBrush
- Then I check proportions in Blender
- Then I fix mistakes, tweak volumes, and refine the silhouette

Since Stitch's shape isn't overly complex, I broke him down into a few main sculpting parts:

- The body: arms, legs, head, and ears
- The nose, eyes, and mouth cavity

While planning the sculpt, I already knew I'd be rigging Stitch, both body and facial rig. So I started sculpting with his mouth open (to later close it and have more flexibility when it comes to rigging and deformation).

While studying various references, I noticed something interesting. Stitch from promotional posters, Stitch from the movie, and Stitch as recreated by different artists on ArtStation all look very different from one another. What surprised me the most was how different the promo version of Stitch is compared to the one in the actual movie. They are essentially two separate models:

- Different proportions
- Different shapes
- Different textures
- Even different fur and overall design

This presented a creative challenge: I had to develop my own take on Stitch's design. Sometimes I liked the way the teeth were done in one version; in another, the eye placement; in another, the fur shape or the claw design on the hands and feet.

At first, considering that Stitch is completely covered in fur from head to toe, sculpting his underlying anatomy seemed pointless. I kept asking myself: "Why sculpt muscles and skin detail if everything will be hidden under fur anyway?"

But eventually, I found a few solid answers for myself. First, having a defined muscle structure actually makes the fur grooming process easier: fur often follows the flow of muscle lines, so having those muscles helps guide fur direction more accurately across the character's body. Second, it's great anatomy practice, and practice is never a waste.
So I found a solid anatomical reference of Stitch with clearly visible muscle groups and tried to recreate that structure as closely as possible in my own sculpt.

In the end, I had to develop a full visual concept by combining elements from multiple versions of Stitch. Through careful reference work and constant switching between Blender and ZBrush, I gradually, but intentionally, built up the body and overall look of our favorite fluffy alien.

Topology & UVs

Throughout the sculpting process, I spent quite a bit of time thinking about topology. I was looking for the most balanced solution between quality and production time. Normally, I do manual retopology for my characters, but this time I knew it would take too much time, and honestly, I didn't have that luxury.

So I decided to generate the topology using ZBrush's tools. I split the model into separate parts using Polygroups, assigning individual groups for the ears, the head, the torso, the arms, the legs, and each of Stitch's fingers. With the Polygroups in place, I used ZRemesher with Keep Groups enabled and smoothing on group borders. This gave me a clean, optimized mesh that was perfect for UV unwrapping. Of course, this kind of auto-retopology isn't a full substitute for manual work, but it saved me a huge amount of time, and the quality was still high enough for what I needed.

However, there was one tricky issue. Although Stitch looks symmetrical at first glance, his ears are actually asymmetrical: the right ear has a scar at the top, while the left has a scar at the bottom. Because of that, I couldn't just mirror one side in ZBrush without losing those unique features.

Here's what I ended up doing: I created a symmetrical model with the right ear, then another symmetrical model with the left ear. I brought both into Blender, detached the left ear from one model, and attached it to the body of the other one.
This way, I got a clean, symmetrical base mesh with asymmetrical ears, preserving both topology and detail. And thanks to the clean Polygroup-based layout, I was able to unwrap the UVs with nice, even seams and clean islands.

When it came to UV mapping, I divided Stitch into two UDIM tiles:

- The first UDIM includes the head with ears, torso, arms, and legs
- The second UDIM contains all the additional parts: teeth, tongue, gums, claws, and nose (for the claws, I used overlapping UVs to preserve texel density for the other parts)

Since the nose is one of the most important details, I allocated the largest space to it, which helped me better capture its intricate details.

As for the eyes, I used procedural eyes, so there was no need to assign UV space or create a separate UDIM for texturing them. To achieve this, I used the Tiny Eye add-on by tinynocky for Blender, which allows full control over procedural eyes and their parameters. This approach gave me high-quality eyes with customizable elements tailored exactly to my needs.

As a result of all these steps, Stitch ended up with a symmetrical, optimized mesh, asymmetrical ears, and the body split across two UDIMs: one for the main body and one for the additional parts.

Texturing

When planning Stitch's texturing, I understood that the main body texture would be fairly simple, with much of the visual detail enhanced by the fur. However, some areas required much more attention than the rest of the body. The textures for Stitch can be roughly divided into several main parts:

- The base body, which includes the primary color of his fur, along with additional shading like a lighter tone on the front (belly) and a darker tone on the back and nape
- The nose and ears, zones that demanded separate focus

At the initial texturing/blocking stage, the ears looked too cartoony, which didn't fit the style I wanted. So I decided to push them towards a more realistic look.
This involved removing bright colors, adding more variation to the roughness map, introducing variation in the base color, and making the ears visually more natural, layered, and textured on the surface. By combining smart materials and masks, I achieved the effect of "living" ears: slightly dirty and looking as natural as possible.

The nose was a separate story. It occupies a significant part of the face and thus draws a lot of attention. While studying references, I noticed that the shape and texture of the nose vary a lot between different artists. Initially, I made it dog-like, with some wear and tear around the nostrils and base. For a long time, I thought this version was acceptable, but during test renders, I realized the nose needed improvement. So I reworked its texturing, aiming to make it more detailed. I divided the nose texture into four main layers:

- Base detail: baked from the high-poly model. Over this, I applied a smart skin material that added characteristic bumps.
- Lighter layer: applied via a mask using the AO channel. This darkened the crevices and brightened the bumps, creating a multi-layered effect.
- Organic detail (capillaries): in animal references, I noticed slight redness in the nose area. I created another AO-masked layer with reddish capillaries visible through the bumps, adding depth and realism.
- Softness: to make the nose visually softer, like in the references, I added a fill layer with only height enabled, used a paper texture as grayscale, and applied a blurred mask. This created subtle dents and wrinkles that softened the look.

All textures were created in 4K resolution to achieve maximum detail. After finishing the main texturing stage, I added an Ambient Occlusion map on the final texture layer, activating only the Color channel, setting the blend mode to Multiply, and reducing opacity to about 35%. This adds volume and greatly improves the overall perception of the model.

That covers the texturing of Stitch's body.
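To see why that low-opacity Multiply pass darkens crevices while leaving open surfaces almost untouched, here is a minimal numeric sketch of the blend math. This is only an illustration of how Multiply-at-reduced-opacity behaves in any layer-based texturing tool; the function name and sample values are mine, not from the original workflow.

```python
def multiply_blend(base: float, ao: float, opacity: float = 0.35) -> float:
    """Blend an AO value over a base channel in Multiply mode at a given opacity.

    At full opacity, Multiply gives base * ao; at partial opacity, the result is
    a linear mix between the untouched base and the fully multiplied value.
    """
    blended = base * ao  # Multiply blend mode
    return base * (1.0 - opacity) + blended * opacity  # opacity mix

# Crevices (low AO) darken noticeably; open surfaces (AO near 1) barely change:
print(multiply_blend(0.8, 0.4))   # a crevice
print(multiply_blend(0.8, 0.95))  # an open surface
```

Because the AO map only ever multiplies values of 1 or less, this pass can only darken, which is what adds the perceived volume without shifting the overall palette.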
I also created a separate texture for the fur. This one was simpler: I disabled unnecessary layers like the ears and eyelids and left only the base ones corresponding to the body's color tones. During grooming (which I'll cover in detail later), I also created textures for the fur's clumps and roughness, and in Substance 3D Painter, I additionally painted masks for better fur detail.

Fur

And finally, I moved on to the part that was most important to me, the very reason I started this project in the first place: fur. This entire process was essentially a test of my fur grooming skills. After overcoming self-doubt, I trusted the process and relied on everything I had learned so far.

Before diving into the grooming itself, I made sure to gather strong references. I searched for the highest-quality and most inspiring examples I could find and analyzed them thoroughly. My goal was to clearly understand the direction of fur growth, its density and volume, the intensity of roughness, and the strength of clumping in different areas of Stitch's body.

To create the fur, I used Blender and its Hair Particle System. The overall approach is similar to sculpting a high-detail model: work from broad strokes to finer details. So the first step was blocking out the main flow and placement of the hair strands.

At this point, I ran into a challenge: symmetry. Since the model was purposefully asymmetrical (because of the ears and skin folds), the fur couldn't be mirrored cleanly. To solve this, I created a base fur blocking using Hair Guides with just two segments. After that, I split the fur into separate parts: I duplicated the main Particle System and created individual hair systems for each area where needed.

In total, I broke Stitch's body into key sections: head, left ear, right ear, front torso, back torso, arms, hands, upper and lower legs, toes, and additional detailing layers. The final fur setup included 25 separate particle systems. To control fur growth, I used Weight Paint to fine-tune the influence on each body part individually.
This separation gave me much more precision and allowed full control over every parameter of the fur on a per-section basis.

The most challenging aspect of working with fur is staying patient and focused. Detail is absolutely critical, because the overall picture is built entirely from tiny, subtle elements. Once the base layer was complete, I moved on to refining the fur based on my references.

The most complex areas turned out to be the front of the torso and the face. When working on the torso, my goal was to create a smooth gradient from thick, clumped fur on the chest to shorter, softer fur on the stomach. Step by step, I adjusted the transitions, directions, clumps, and volumes to achieve that look. Additionally, I used the fur itself to subtly enhance Stitch's silhouette, making his overall shape feel sharper, more expressive, and visually engaging.

During fur development, I used texture maps to control the intensity of the Roughness and Clump parameters. This gave me a high degree of flexibility: textures drove these attributes across the entire model. In areas where stronger clumping or roughness was needed, I used brighter values; in zones requiring a softer look, darker values. This approach allowed fine-tuned, micro-level control of the fur shader and helped achieve a highly realistic appearance in renders.

The face required special attention: the fur had to be neat, evenly distributed, and still visually appealing. The biggest challenge here was working around the eye area. Even with properly adjusted Weight Paint, interpolation sometimes caused strands to creep into the eyes. I spent a lot of time cleaning up this region to get an optimal result. I also had to revisit certain patches that looked bald, even though interpolation and weight painting were set correctly, because the fur didn't render properly there.
These areas needed manual fixing.

As part of the detailing stage, I also increased the number of segments in the Hair Guides. While the blocking phase used only two segments, I went up to three, and in some cases even five, for more complex regions. This gave me much more control over fur shape and flow. The tiniest details really matter, so I added extra fur layers with thinner, more chaotic strands extending slightly beyond the main silhouette. These micro-layers significantly improved the texture depth and boosted the overall realism.

Aside from the grooming itself, I paid special attention to the fur material setup, as the shader plays a critical role in the final visual quality of the render. It's not enough to simply plug a color texture into a Principled BSDF node and call it done. I built a more complex shader, giving me precise control over various attributes. For example, I implemented subtle color variation across individual strands, along with darkening near the roots and a gradual brightening toward the tips. This helped add visual depth and made the fur look significantly more natural and lifelike.

Working on the fur took up nearly half of the total time I spent on the entire model. And I'm genuinely happy with the result; this stage confirmed that the training I've gone through was solid and that I'm heading in the right direction with my artistic development.

Rigging, Posing & Scene

Once I finished working on the fur, I rendered several 4K test shots from different angles to make sure every detail looked the way I intended. When I was fully satisfied with the results, it was time to move on to rigging. I divided the rigging process into three main parts:

- Body rig, for posing and positioning the character
- Facial rig, for expressions and emotions
- Ear rig, for dynamic ear control

Rigging isn't something I consider my strongest skill, but as a 3D generalist, I had to dive into many technical aspects of it.
For the ears, I set up a relatively simple system of several bones connected using inverse kinematics. This gave me flexible, intuitive control during posing and allowed for the addition of dynamic movement in animation. For facial rigging, I used the FaceIt add-on, which generates a complete facial control system for the mouth, eyes, and tongue. It sped up the process significantly and gave me more precision. For the body, I used the ActorCore rig, then converted it to Rigify, which gave me a familiar interface and flexible control over poses.

Posing is one of my favorite stages; it's when the character really comes to life. As usual, it started with gathering references. Honestly, it was hard to pick the final poses. Stitch is so expressive and full of personality that I wanted to try hundreds of them, but I focused on those that best conveyed the spirit and mood of the character. Some poses I reworked to fit my style rather than copying directly. For example, in the pose where Stitch licks his nose, I added drool and a bit of "green slime" for comedic effect. To capture motion, I tilted his head back and made the ears fly upward, creating a vivid, emotional snapshot.

Just like in sculpting or grooming, minor details make a big difference in posing: a slight asymmetry in the facial expression, a raised corner of the mouth, one eye squinting a little more than the other, ears set at slightly different angles. These are subtle things that might not be noticed immediately, but they're the key to making the character feel alive and believable.

For each pose, I created a separate scene and collection in Blender, including the character, a specific lighting setup, and a simple background or environment.
This made it easy to return to any scene later to adjust lighting, reposition the character, or tweak the background.

In one of the renders, which I used as the cover image, Stitch is holding a little frog. I want to clearly note that the 3D model of the frog is not mine; full credit goes to the original author of the asset.

At first, I wanted to build a full environment around Stitch, to create a scene that would feel like a frame from a film. But after carefully evaluating my skills and priorities, I decided that a weak environment would only detract from the strength of the character. So I opted for a simple, neutral backdrop designed to keep all the focus on Stitch himself.

Rendering, Lighting & Post-Processing

When the character is complete, posed expressively, and integrated into the scene, there's one final step: lighting. Lighting isn't just a technical element of the scene; it's a full-fledged stage of the 3D pipeline. It doesn't just illuminate, it paints. Proper lighting can highlight the personality of the character, emphasize forms, and create atmosphere.

For all my renders, I rely on the classic three-point lighting setup: Key Light, Fill Light, and Rim Light. While this setup is well known, it remains highly effective. When done thoughtfully, with the right intensity, direction, and color temperature, it creates a strong light-shadow composition that brings the model to life. In addition to the three main lights, I also use an HDRI map, but at a very low intensity, around 0.3, just enough to subtly enrich the ambient light without overpowering the scene.

Once everything is set, it's time to hit Render and wait for the result. Due to hardware limitations, I wasn't able to produce full animated shots with fur. Rendering a single 4K image with fur took over an hour, so I limited myself to a 360° turnaround and several static renders.

I don't spend too much time on post-processing, just basic refinements in Photoshop.
Slight enhancement of the composition, gentle shadow adjustments, color balance tweaks, and adding a logo. Everything is done subtly, nothing overprocessed; the goal is simply to support and enhance what's already there.

Final Thoughts

This project has been an incredible experience. Although it was my second time creating Stitch, this time the process felt completely different at every stage. And honestly, it wasn't easy. But that was exactly the point: to challenge myself. To reimagine something familiar, to try things I'd never done before, and to walk the full journey from start to finish.

The fur, the heart of this project, was especially meaningful to me. It's what started it all. I poured a lot into this model: time, effort, emotion, and even doubts. But at the same time, I brought all my knowledge, skills, and experience into it. This work became a mirror of my progress from 2023 to 2025. I can clearly see how far I've come, and that gives me the motivation to keep going. Every hour of learning and practice paid off; the results speak for themselves.

This model was created for my portfolio. I don't plan to use it commercially, unless, of course, a studio actually wants to license it for a new film. It's been a long road: challenging, sometimes exhausting, but above all inspiring and exciting. I know there's still a lot to learn, many things to study, improve, and polish to perfection. But I'm already on that path, and I'm not stopping.

Oleh Yakushev, 3D Character Artist

Interview conducted by Gloria Levine
    Fur Grooming Techniques For Realistic Stitch In Blender
    80.lv
Introduction

Hi everyone! My name is Oleh Yakushev, and I'm a 3D Artist from Ukraine. My journey into 3D began just three years ago, when I was working as a mobile phone salesperson at a shopping mall. In 2022, during one slow day at work, I noticed a colleague learning Python, and we started talking about life goals. I told him I wanted to switch careers, to do something creative, but programming wasn't really my thing.

He asked me a simple question: "Well, what do you actually enjoy doing?"

I said, "Video games. I love video games. But I don't have time to learn how to make them. I've got a job, a family, and a kid."

Then he hit me with something that really shifted my whole perspective. "Oleh, do you play games on your PlayStation?"

I said, "Of course."

He replied, "Then why not take the time you spend playing and use it to learn how to make games?"

That moment flipped a switch in my mind. I realized that I did have time; it was just a matter of how I used it. If I really wanted to learn, I could find a way. At the time, I didn't even own a computer. But where there's a will, there's a way: I borrowed my sister's laptop for a month and started following beginner 3D tutorials on YouTube. Every night after work, once my family went to sleep, I'd sit in the kitchen and study. I stayed up until 2 or 3 AM learning Blender basics, then slept for a few hours before waking up at 6 AM to go back to work. That's how I spent my first few months in 3D, studying every single night.

3D completely took over my life. During lunch breaks, I watched 3D videos; on the bus, I scrolled through 3D TikToks; at home, I took 3D courses. The word "3D" just became a constant in my vocabulary. After a few months of learning the basics, I started building my portfolio, which looks pretty funny to me now. But at the time, it was a real sign of how committed I was. Eventually, someone reached out to me through Behance, offering my first freelance opportunity.
And thatэs how my journey began, from mall clerk to 3D artist. It's been a tough road, full of burnout, doubts, and late nights... but also full of curiosity, growth, and hope. And I wouldn't trade it for anything.The Stitch ProjectI've loved Stitch since I was a kid. I used to watch the cartoons, play the video games, and he always felt like such a warm, funny, chill, and at the same time, strong character. So once I reached a certain level in 3D, I decided to recreate Stitch.Back then, my skills only allowed me to make him in a stylized cartoonish style, no fur, no complex detailing, no advanced texturing, I just didn't have the experience. Surprisingly, the result turned out pretty decent. Even now, I sometimes get comments that my old Stitch still looks quite cute. Though honestly, I wouldn't say that myself anymore. Two years have passed since I made that first Stitch, it was back in 2023. And in 2025, I decided it was time to challenge myself.At that point, I had just completed an intense grooming course. Grooming always intimidated me, it felt really complex. I avoided it on commercial projects, made a few failed attempts for my portfolio, and overall tried to steer clear of any tasks where grooming was required. But eventually, I found the strength to face it.I pushed myself to learn how to make great fur, and I did. I finally understood how the grooming system works, grasped the logic, the tools, and the workflow. And after finishing the course, I wanted to lock in all that knowledge by creating a full personal project from scratch.So my goal was to make a character from the ground up, where the final stage would be grooming. And without thinking too long, I chose Stitch.First, because I truly love the character. Second, I wanted to clearly see my own progress over the past two years. 
Third, I needed to put my new skills to the test and find out whether my training had really paid off.ModelingI had a few ideas for how to approach the base mesh for this project. First, to model everything completely from scratch, starting with a sphere. Second, to reuse my old Stitch model and upgrade it.But then an idea struck me: why not test how well AI could handle a base mesh? I gathered some references and tried generating a base mesh using AI, uploading Stitch visuals as a guide. As you can see from the screenshot, the result was far from usable. So I basically ended up doing everything from scratch anyway.So, I went back to basics: digging through ArtStation and Pinterest, collecting references. Since over the last two years, I had not only learned grooming but also completely changed my overall approach to character creation, it was important for me to make a more detailed model, even if much of it would be hidden under fur.The first Stitch was sculpted in Blender, with all the limitations that come with sculpting in it. But since then, I've leveled up significantly and switched to more advanced tools. So this second version of Stitch was born in ZBrush. By the time I started working on this Stitch, ZBrush had already become my second main workspace. I've used it to deliver tons of commercial projects, I work in it almost daily, and most of my portfolio was created using this tool. I found some great reference images showing Stitch's body structure. Among them were official movie references and a stunning high-poly model created by Juan Hernández, a version of Stitch without fur. That model became my primary reference for sculpting.Truth is, Stitch's base form is quite simple, so blocking out the shape didn't take too long. 
When blocking, I use Blender in combination with ZBrush:I work with primary forms in ZBrushThen check proportions in BlenderFix mistakes, tweak volumes, and refine the silhouetteSince Stitch's shape isn't overly complex, I broke him down into three main sculpting parts:The body: arms, legs, head, and earsThe nose, eyes, and mouth cavityWhile planning the sculpt, I already knew I'd be rigging Stitch, both body and facial rig. So I started sculpting with his mouth open (to later close it and have more flexibility when it comes to rigging and deformation).While studying various references, I noticed something interesting. Stitch from promotional posters, Stitch from the movie, and Stitch as recreated by different artists on ArtStation all look very different from one another. What surprised me the most was how different the promo version of Stitch is compared to the one in the actual movie. They are essentially two separate models:Different proportionsDifferent shapesDifferent texturesEven different fur and overall designThis presented a creative challenge, I had to develop my own take on Stitch's design. Sometimes I liked the way the teeth were done in one version, in another, the eye placement, in another, the fur shape, or the claw design on hands and feet.At first, considering that Stitch is completely covered in fur from head to toe, sculpting his underlying anatomy seemed pointless. I kept asking myself: "Why sculpt muscles and skin detail if everything will be hidden under fur anyway?"But eventually, I found a few solid answers for myself. First, having a defined muscle structure actually makes the fur grooming process easier. That's because fur often follows the flow of muscle lines, so having those muscles helps guide fur direction more accurately across the character's body.Second, it's great anatomy practice, and practice is never a waste. 
So, I found a solid anatomical reference of Stitch with clearly visible muscle groups and tried to recreate that structure as closely as possible in my own sculpt.In the end, I had to develop a full visual concept by combining elements from multiple versions of Stitch. Through careful reference work and constantly switching between Blender and ZBrush, I gradually, but intentionally, built up the body and overall look of our favorite fluffy alien.Topology & UVsThroughout the sculpting process, I spent quite a bit of time thinking about topology. I was looking for the most balanced solution between quality and production time. Normally, I do manual retopology for my characters, but this time, I knew it would take too much time, and honestly, I didn't have that luxury.So I decided to generate the topology using ZBrush's tools. I split the model into separate parts using Polygroups, assigning individual groups for the ears, the head, the torso, the arms, the legs, and each of Stitch's fingers.With the Polygroups in place, I used ZRemesher with Keep Groups enabled and smoothing on group borders. This gave me a clean and optimized mesh that was perfect for UV unwrapping.Of course, this kind of auto-retopology isn't a full substitute for manual work, but it saved me a huge amount of time, and the quality was still high enough for what I needed. However, there was one tricky issue. Although Stitch looks symmetrical at first glance, his ears are actually asymmetrical. The right ear has a scar on the top, while the left has a scar on the bottomBecause of that, I couldn't just mirror one side in ZBrush without losing those unique features. Here's what I ended up doing: I created a symmetrical model with the right ear, then another symmetrical model with the left ear. I brought both into Blender, detached the left ear from one model, and attached it to the body of the other one. 
This way, I got a clean, symmetrical base mesh with asymmetrical ears, preserving both topology and detail. And thanks to the clean polygroup-based layout, I was able to unwrap the UVs with nice, even seams and clean islands.When it came to UV mapping, I divided Stitch into two UDIM tiles:The first UDIM includes the head with ears, torso, arms, and legs.The second UDIM contains all the additional parts: teeth, tongue, gums, claws, and nose (For the claws, I used overlapping UVs to preserve texel density for the other parts)Since the nose is one of the most important details, I allocated the largest space to it, which helped me to better capture its intricate details.As for the eyes, I used procedural eyes, so there was no need to assign UV space or create a separate UDIM for texturing them. To achieve this, I used the Tiny Eye add-on by tinynocky for Blender, which allows full control over procedural eyes and their parameters.This approach gave me high-quality eyes with customizable elements tailored exactly to my needs. As a result of all these steps, Stitch ended up with a symmetrical, optimized mesh, asymmetrical ears, and the body split across two UDIMs, one for the main body and one for the additional parts.TexturingWhen planning Stitch's texturing, I understood that the main body texture would be fairly simple, with much of the visual detail enhanced by the fur. However, there were some areas that required much more attention than the rest of the body. The textures for Stitch can be roughly divided into several main parts:The base body, which includes the primary color of his fur, along with additional shading like a lighter tone on the front (belly) and a darker tone on the back and napeThe nose and ears, these zones, demanded separate focusAt the initial texturing/blocking stage, the ears looked too cartoony, which didn’t fit the style I wanted. So, I decided to push them towards a more realistic look. 
This involved removing bright colors, adding more variation in the roughness map, introducing variation in the base color, and making the ears visually more natural, layered, and textured on the surface. By combining smart materials and masks, I achieved the effect of "living" ears, slightly dirty and looking as natural as possible.The nose was a separate story. It occupies a significant part of the face and thus draws a lot of attention. While studying references, I noticed that the shape and texture of the nose vary a lot between different artists. Initially, I made it dog-like, with some wear and tear around the nostrils and base.For a long time, I thought this version was acceptable. But during test renders, I realized the nose needed improvement. So I reworked its texturing, aiming to make it more detailed. I divided the nose texture into four main layers:Base detail: Baked from the high-poly model. Over this, I applied a smart skin material that added characteristic bumps.Lighter layer: Applied via a mask using the AO channel. This darkened the crevices and brightened the bumps, creating a multi-layered effect.Organic detail (capillaries): In animal references, I noticed slight redness in the nose area. I created another AO-masked layer with reddish capillaries visible through the bumps, adding depth and realism.Softness: To make the nose visually softer, like in references, I added a fill layer with only height enabled, used a paper texture as grayscale, and applied a blurred mask. This created subtle dents and wrinkles that softened the look.All textures were created in 4K resolution to achieve maximum detail. After finishing the main texturing stage, I add an Ambient Occlusion map on the final texture layer, activating only the Color channel, setting the blend mode to Multiply, and reducing opacity to about 35%. This adds volume and greatly improves the overall perception of the model.That covers the texturing of Stitch’s body. 
I also created a separate texture for the fur. This was simpler, I disabled unnecessary layers like ears and eyelids, and left only the base ones corresponding to the body’s color tones.During grooming (which I'll cover in detail later), I also created textures for the fur's clamps and roughness. In Substance 3D Painter, I additionally painted masks for better fur detail.FurAnd finally, I moved on to the part that was most important to me, the very reason I started this project in the first place. Fur. This entire process was essentially a test of my fur grooming skills. After overcoming self-doubt, I trusted the process and relied on everything I had learned so far. Before diving into the grooming itself, I made sure to gather strong references. I searched for the highest quality and most inspiring examples I could find and analyzed them thoroughly. My goal was to clearly understand the direction of fur growth, its density and volume, the intensity of roughness, and the strength of clumping in different areas of Stitch's body.To create the fur, I used Blender and its Hair Particle System. The overall approach is similar to sculpting a high-detail model: work from broad strokes to finer details. So, the first step was blocking out the main flow and placement of the hair strands.At this point, I ran into a challenge: symmetry. Since the model was purposefully asymmetrical (because of the ears and skin folds), the fur couldn't be mirrored cleanly. To solve this, I created a base fur blocking using Hair Guides with just two segments. After that, I split the fur into separate parts. I duplicated the main Particle System and created individual hair systems for each area where needed.In total, I broke Stitch's body into key sections: head, left ear, right ear, front torso, back torso, arms, hands, upper and lower legs, toes, and additional detailing layers. 
The final fur setup included 25 separate particle systems.

To control fur growth, I used Weight Paint to fine-tune the influence on each body part individually. This separation gave me much more precision and allowed full control over every parameter of the fur on a per-section basis.

The most challenging aspect of working with fur is staying patient and focused. Detail is absolutely critical, because the overall picture is built entirely from tiny, subtle elements. Once the base layer was complete, I moved on to refining the fur based on my references.

The most complex areas turned out to be the front of the torso and the face. When working on the torso, my goal was to create a smooth gradient, from thick, clumped fur on the chest to shorter, softer fur on the stomach. Step by step, I adjusted the transitions, directions, clumps, and volumes to achieve that look. Additionally, I used the fur itself to subtly enhance Stitch's silhouette, making his overall shape feel sharper, more expressive, and visually engaging.

During fur development, I used texture maps to control the intensity of the Roughness and Clump parameters. This gave me a high degree of flexibility, since textures drove these attributes across the entire model. In areas where stronger clumping or roughness was needed, I used brighter values; in zones requiring a softer look, darker values. This approach allowed for fine-tuned, micro-level control of the fur shader and helped achieve a highly realistic appearance in renders.

The face required special attention: the fur had to be neat, evenly distributed, and still visually appealing. The biggest challenge here was working around the eye area. Even with properly adjusted Weight Paint, interpolation sometimes caused strands to creep into the eyes. I spent a lot of time cleaning up this region to get an optimal result.
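The texture-driven control described above is, at its core, a per-texel remap of a grayscale mask onto a parameter range: brighter mask values push the attribute toward its maximum, darker values toward its minimum. A small sketch of that mapping (the parameter ranges here are illustrative assumptions, not the artist's actual settings):

```python
import numpy as np

def mask_to_param(mask: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Remap a grayscale control texture in [0, 1] onto a parameter range.

    Brighter texels -> values near `hi` (e.g. stronger clumping or roughness),
    darker texels   -> values near `lo` (softer look).
    """
    return lo + (hi - lo) * np.clip(mask, 0.0, 1.0)

# Example: three mask texels driving a hypothetical Clump range of [0.1, 0.9].
clump = mask_to_param(np.array([0.0, 0.8, 1.0]), lo=0.1, hi=0.9)  # -> [0.1, 0.74, 0.9]
```

The same remap works for any attribute the groom exposes, which is why one painted mask can drive both clump and roughness with different ranges.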
I also had to revisit certain patches that looked bald, even though interpolation and weight painting were set correctly, because the fur didn't render properly there. These areas needed manual fixing.

As part of the detailing stage, I also increased the number of segments in the Hair Guides. While the blocking phase used only two segments, I went up to three, and in some cases even five, for more complex regions. This gave me much more control over fur shape and flow.

The tiniest details really matter, so I added extra fur layers with thinner, more chaotic strands extending slightly beyond the main silhouette. These micro-layers significantly improved the texture depth and boosted the overall realism.

Aside from the grooming itself, I paid special attention to the fur material setup, as the shader plays a critical role in the final visual quality of the render. It's not enough to simply plug a color texture into a Principled BSDF node and call it done. I built a more complex shader, giving me precise control over various attributes. For example, I implemented subtle color variation across individual strands, along with darkening near the roots and a gradual brightening toward the tips. This helped add visual depth and made the fur look significantly more natural and lifelike.

Working on the fur took up nearly half of the total time I spent on the entire model. And I'm genuinely happy with the result: this stage confirmed that the training I've gone through was solid and that I’m heading in the right direction with my artistic development.

Rigging, Posing & Scene

Once I finished working on the fur, I rendered several 4K test shots from different angles to make sure every detail looked the way I intended.
When I was fully satisfied with the results, it was time to move on to rigging. I divided the rigging process into three main parts:

1. Body rig, for posing and positioning the character
2. Facial rig, for expressions and emotions
3. Ear rig, for dynamic ear control

Rigging isn't something I consider my strongest skill, but as a 3D generalist, I had to dive into many of its technical aspects. For the ears, I set up a relatively simple system with several bones connected using inverse kinematics (IK). This gave me flexible and intuitive control during posing and allowed for the addition of dynamic movement in animation. For facial rigging, I used the FaceIt add-on, which generates a complete facial control system for the mouth, eyes, and tongue. It sped up the process significantly and gave me more precision. For the body, I used the ActorCore Rig by NVIDIA, then converted it to Rigify, which gave me a familiar interface and flexible control over poses.

Posing is one of my favorite stages: it's when the character really comes to life. As usual, it started with gathering references. Honestly, it was hard to pick the final poses; Stitch is so expressive and full of personality that I wanted to try hundreds of them. But I focused on those that best conveyed the spirit and mood of the character. Some poses I reworked to fit my style rather than copying directly. For example, in the pose where Stitch licks his nose, I added drool and a bit of "green slime" for comedic effect. To capture motion, I tilted his head back and made the ears fly upward, creating a vivid, emotional snapshot.

Just like in sculpting or grooming, minor details make a big difference in posing.
Examples include a slight asymmetry in the facial expression, a raised corner of the mouth, one eye squinting a little more than the other, and ears set at slightly different angles. These are subtle things that might not be noticed immediately, but they’re the key to making the character feel alive and believable.

For each pose, I created a separate scene and collection in Blender, including the character, a specific lighting setup, and a simple background or environment. This made it easy to return to any scene later to adjust lighting, reposition the character, or tweak the background.

In one of the renders, which I used as the cover image, Stitch is holding a little frog. I want to clearly note that the 3D model of the frog is not mine; full credit goes to the original author of the asset.

At first, I wanted to build a full environment around Stitch, to create a scene that would feel like a frame from a film. But after carefully evaluating my skills and priorities, I decided that a weak environment would only detract from the strength of the character. So I opted for a simple, neutral backdrop designed to keep all the focus on Stitch himself.

Rendering, Lighting & Post-Processing

When the character is complete, posed expressively, and integrated into the scene, there's one final step: lighting. Lighting isn't just a technical element of the scene — it’s a full-fledged stage of the 3D pipeline. It doesn't just illuminate; it paints. Proper lighting can highlight the personality of the character, emphasize forms, and create atmosphere.

For all my renders, I rely on the classic three-point lighting setup: Key Light, Fill Light, and Rim Light. While this setup is well known, it remains highly effective. When done thoughtfully, with the right intensity, direction, and color temperature, it creates a strong light-shadow composition that brings the model to life.
In addition to the three main lights, I also use an HDRI map, but at very low intensity, around 0.3, just enough to subtly enrich the ambient light without overpowering the scene.

Once everything is set, it's time to hit Render and wait for the result. Due to hardware limitations, I wasn’t able to produce full animated shots with fur: rendering a single 4K image with fur took over an hour, so I limited myself to a 360° turnaround and several static renders.

I don't spend too much time on post-processing, just basic refinements in Photoshop: slight enhancement of the composition, gentle shadow adjustments, color balance tweaks, and adding a logo. Everything is done subtly, nothing overprocessed. The goal is simply to support and enhance what’s already there.

Final Thoughts

This project has been an incredible experience. Although it was my second time creating Stitch (the first was back in 2023), this time the process felt completely different at every stage. And honestly, it wasn't easy. But that was exactly the point: to challenge myself. To reimagine something familiar, to try things I'd never done before, and to walk the full journey from start to finish.

The fur, the heart of this project, was especially meaningful to me. It’s what started it all. I poured a lot into this model: time, effort, emotion, and even doubts. But at the same time, I brought all my knowledge, skills, and experience into it.

This work became a mirror of my progress from 2023 to 2025. I can clearly see how far I've come, and that gives me the motivation to keep going. Every hour of learning and practice paid off; the results speak for themselves. This model was created for my portfolio. I don't plan to use it commercially, unless, of course, a studio actually wants to license it for a new film (in that case, I'd be more than happy!).

It's been a long road: challenging, sometimes exhausting, but above all inspiring and exciting. I know there's still a lot to learn.
Many things to study, improve, and polish to perfection. But I'm already on that path, and I'm not stopping.

Oleh Yakushev, 3D Character Artist

Interview conducted by Gloria Levine
  • Gaming Meets Streaming: Inside the Shift

    80.lv
After a long, busy day, you boot up your gaming device but don’t quite feel like diving into an intense session. Instead, you open a broadcast of one of your favorite streamers and spend the evening laughing at commentary, reacting to unexpected moments, and just enjoying your time with fellow gamers. Sounds familiar?

This everyday scenario perfectly captures the way live streaming platforms like Twitch, YouTube Gaming, or Kick have transformed the gaming experience — turning gameplay into shared moments where gamers broadcast in real-time while viewers watch, chat, learn, and discover new titles.

What started as friends sharing gameplay clips has exploded into a multi-billion-dollar ecosystem where streamers are popular creators, viewers build communities around shared experiences, and watching games has become as popular as playing them. But how did streaming become such a powerful force in gaming – and what does it mean for players, creators, and the industry alike? Let’s find out!

Why Do Gamers Love Streaming?

So why are millions of gamers spending hours every week watching others play instead of jumping into a game themselves? The answer isn’t just one thing – it’s a mix of entertainment, learning, connection, and discovery that makes live streaming uniquely compelling. Let’s break it down.

Entertainment at Your Own Pace

Sometimes, you just want to relax. Maybe you’re too mentally drained to queue up for ranked matches or start that complex RPG quest. Streaming offers the perfect low-effort alternative – the fun of gaming without needing to press a single button. Whether it's high-stakes gameplay, hilarious commentary, or unpredictable in-game chaos, streams let you enjoy all the excitement while kicking back on the couch, grabbing a snack, or chatting in the background.

Learning and Skill Development

Streaming isn’t just for laughs – it’s also one of the best ways to level up your own gameplay.
Watching a skilled streamer handle a tricky boss fight, execute high-level strategies, or master a game’s mechanics can teach you far more than a dry tutorial ever could. Many gamers tune in specifically to study routes, tactics, builds, or even to understand if a game suits their playstyle before buying it. Think of it as education, but way more fun.

Social Connection and Community

One of the most powerful draws of live streaming is the sense of community. Jumping into a stream isn’t like watching TV – it’s like entering a room full of people who love the same games you do. Chatting with fellow viewers, sharing reactions in real-time, tossing emotes into the chaos, and getting shoutouts from the streamer – it all creates a sense of belonging. For many, it’s a go-to social space where friendships, inside jokes, and even fandoms grow.

Discovery of New Games and Trends

Ever found a game you now love just because you saw a streamer play it? You’re not alone. Streaming has become a major discovery engine in gaming. Watching creators try new releases, revisit cult classics, or spotlight lesser-known indies helps players find titles they might never encounter on their own. Sometimes, entire genres or games blow up because of a few well-timed streams (Among Us, Vampire Survivors, Only Up! – all made big by streamers).

Together, these draws have sparked a whole new kind of culture – gaming communities with their own languages, celebrities, and shared rituals.

Inside Streaming Culture

Streaming has created something unique in gaming: genuine relationships between creators and audiences who've never met. When Asmongold reacts to the latest releases or penguinz0 delivers his signature deadpan commentary, millions of viewers don't just watch – they feel like they're hanging out with a friend. These streamers have become trusted voices whose opinions carry real weight, making gaming fame more accessible than ever.
Anyone with personality and dedication can build a loyal following and become a cultural influencer.

If you've ever watched a Twitch stream, you've witnessed chat culture in action – a chaotic river of emotes, inside jokes, and reactions that somehow make perfect sense to regulars. "KEKW" expresses laughter, "Poggers" shows excitement, and memes spread like wildfire across communities. The chat itself becomes entertainment, with viewers competing to land the perfect reaction at just the right moment. These expressions often escape their stream origins, becoming part of the broader gaming vocabulary.

For many viewers, streams have become part of their daily routine – tuning in at the same time, celebrating milestones, or witnessing historic gaming moments together. When a streamer finally beats that impossible boss, the entire community shares in the victory. These aren't just individual entertainment experiences — they're collective memories where thousands can say "I was there when it happened," creating communities that extend far beyond gaming itself.

How Streamers Are Reshaping the Gaming Industry

While players tune in for fun and connection, behind the scenes, streaming is quietly reshaping how the gaming industry approaches everything from marketing to game design. What started as casual gameplay broadcasts is now influencing major decisions across studios and publishers.

The New Marketing Powerhouse. Traditional game reviews and advertising have taken a backseat to streamer influence. A single popular creator playing your game can generate millions of views and drive massive sales overnight – just look at how Among Us exploded after a few key streamers discovered it, or how Fall Guys became a phenomenon through streaming momentum. Publishers now prioritize getting their games into the hands of influential streamers on launch day, knowing that authentic gameplay footage and reactions carry more weight than any trailer or review.
Day-one streaming success has become make-or-break for many titles.

Designing for the Stream. Developers are now creating games with streaming in mind. Modern titles include built-in streaming tools, spectator-friendly interfaces, and features that encourage viewer interaction like chat integration and voting systems. Games are designed to be visually clear and exciting to watch, not just play. Some developers even create "streamer modes" that remove copyrighted music or add special features for streamers. The rise of streaming has birthed entirely new genres — party games, reaction-heavy horror titles, and social deduction games all thrive because they're inherently entertaining to watch.

The Creator Economy Boom. Streaming has created entirely new career paths and revenue streams within gaming. Successful streamers earn through donations, subscriptions, brand partnerships, and revenue sharing from platform-specific features like Twitch bits or YouTube Super Chat. This has spawned a massive creator economy where top streamers command six-figure sponsorship deals, while publishers allocate significant budgets to influencer partnerships rather than traditional advertising. The rise of streaming has also fueled the growth of esports, where pro players double as entertainers – drawing massive online audiences and blurring the line between competition and content.

Video Game Streaming in Numbers

While it’s easy to feel the impact of streaming in daily gaming life, the numbers behind the trend tell an even more powerful story. From billions in revenue to global shifts in viewer behavior, game streaming has grown into a massive industry reshaping how we play, watch, and connect. Here’s a look at the data driving the movement.

Market Size & Growth

In 2025, the global Games Live Streaming market is projected to generate $15.32 billion in revenue.
By 2030, that figure is expected to reach $18.92 billion, growing at an annual rate of 4.32%.The average revenue per user (ARPU) in 2025 stands at $10.51, showing consistent monetization across platforms.China remains the single largest market, expected to bring in $2.92 billion this year alone.Source: Statista Market Insights, 2025Viewership & Daily HabitsThe number of users in the live game streaming market is forecast to hit 1.8 billion by 2030, with user penetration rising from 18.6% in 2025 to 22.6% by the end of the decade.In 2023, average daily time spent watching game streams rose to 2.5 hours per user, up 12% year-over-year — a clear sign of streaming becoming part of gamers’ daily routines.Sources: Statista Market Insights, 2025; SNS Insider, 2024What People Are WatchingThe most-watched games on Twitch include League of Legends, GTA V, and Counter-Strike — all regularly topping charts for both viewers and streamers.When it comes to creators, the most-streamed games are Fortnite, Valorant, and Call of Duty: Warzone, showing a strong overlap between what streamers love to broadcast and what audiences enjoy watching.In Q1 2024, Twitch users spent over 249 million hours watching new game releases, while total gaming-related content reached around 3.3 billion hours.Sources: SullyGnome, 2025; Statista, 2025Global Trends & Regional PlatformsChina’s local platforms like Huya (31M MAU) and Douyu (26.6M MAU) remain key players in the domestic market.In South Korea, following Twitch’s 2023 exit, local services like AfreecaTV and newcomer Chzzk have positioned themselves as alternatives.Meanwhile, Japan and Europe continue to see steady engagement driven by strong gaming scenes and dedicated fan communities.Source: Statista, 2025Event Livestreaming Hits New HighsNintendo Direct was the most-watched gaming showcase in 2024, with an average minute audience of 2.6 million.The 2024 Streamer Awards drew over 645,000 peak viewers, highlighting how creator-focused events 
now rival traditional game showcases.Source: Statista, 2025As game streaming continues to evolve, its role in the broader gaming ecosystem is becoming clearer. It hasn’t replaced traditional gameplay – instead, it’s added a new dimension to how people engage with games, offering a space for connection, discovery, and commentary. For players, creators, and industry leaders alike, streaming now sits alongside playing as a core part of the modern gaming experience – one that continues to grow and shift with the industry itself.
  • How's everyone doing? I wanted to share an amazing topic with you!

    Did you hear that peacock feathers can emit laser beams?! Honestly, how does nature have such extraordinary capabilities? This news comes from scientists speaking to Al Jazeera Net, and it really makes us reflect on the beauty and genius of nature.

    Personally, whenever I see a peacock, I love to contemplate its colors and beauty, but I never imagined it had talents like this. Glory be to God, nature holds so many secrets.

    Let's open our minds and discover more of this wonderful world.

    https://news.google.com/rss/articles/CBMi3wJBVV95cUxNM2xuOUFlNWpOeFF4c0YtLUpiQ29CZm1KanhuVXFoOEJlVF9ua2lGYUpmaGQwS3dqekR2NTUxbUNKMGlPR3FPeWhyYm9SS1VlcnRCUS1DdW
    news.google.com
    Scientists to Al Jazeera Net: Peacock feathers emit laser beams | Al Jazeera Net
  • Metal Gear Solid Delta: Snake Eater developer interview

    Metal Gear Solid Delta: Snake Eater, launching August 28 on PlayStation 5, is a remake of the 2004 PlayStation 2 classic, Metal Gear Solid 3: Snake Eater. I had a conversation with the developers during a Tokyo press event to discuss the upcoming remake and its development process. 


    Faithfully replicating the thrill and impact of the original

    PlayStation Blog: How important was it to your team to create a game that stayed true to the original?

    Noriaki Okamura (Metal Gear Series Producer): We began this project with the intention of bringing a 20-year-old game to the present day. While we updated the graphics and certain game mechanics to ensure today’s players could fully enjoy the experience, we wanted to stay true to the original as much as possible.

    What challenges did your team face during development, and what specific adjustments were implemented?

    Okamura: I had no intention of altering the original story, so I insisted that we could just update the game's graphics. Korekado disagreed and warned me that that approach would not work, but I initially had the team re-create the game with just new character models. Although the graphics improved, the characters appeared doll-like and unrealistic, so I finally realized that my plan was inadequate.

    Yuji Korekado (Creative Producer): We began by reworking the animation and game mechanics. We implemented animation programming that didn’t exist two decades ago to make the game more realistic, but that also meant we couldn’t reproduce the original game mechanics. Metal Gear is a stealth game, so it’s crucial for players to be able to make precise movements. We put in a lot of effort to replicate the same feel as the original, while maintaining realism.

    Are there any areas of the game that you wanted to recreate as faithfully as possible?

    Korekado: We made sure that the jungle looked as realistic as possible. We devoted a lot of time modeling every fine detail like leaves, grass, and moss covering the ground. Since the perspective shifts along with the character’s movements, players will get a closer look at the ground when they’re crouching or crawling. To make sure the environment was immersive from every angle, we carefully crafted every element with great precision.


    Have any enhancements been made compared to the original PS2 version?

    Korekado: We enhanced the visuals to be more intuitive. Thanks to increased memory and much faster speeds, the user experience has improved significantly, including faster transitions to the Survival Viewer and a quick menu to swap uniforms. On top of that, the audio improvements are remarkable. Sound absorption rates vary depending on the materials of the walls and floors, which allows players to detect enemies behind walls or nearby animals intuitively. In areas like caves and long corridors, unique echo parameters help distinguish different environments, which I think is a major advancement for stealth gameplay.

    Extra content for players to enjoy diverse gameplay

    The remake features Fox Hunt, a new online multiplayer mode. Why did you include this in the game instead of Metal Gear Online (MGO)?

    Yu Sahara (Fox Hunt Director): The remake features significantly enhanced graphics, so we explored various online modes that aligned with these improvements. We decided to focus on stealth, sneaking, and survival, since those are also the key pillars of the main game. We landed on a concept based on hide-and-seek, which is classic Metal Gear, while also being reminiscent of the stealth missions featured in the earlier MGO.

    Can players earn rewards by playing the Fox Hunt mode?

    Sahara: While there are no items that can be transferred to the main game, players can unlock rewards like new camouflage options by playing Fox Hunt multiple times.

    Were there any challenges or specific areas of focus while remaking Snake vs Monkey mode?

    Taiga Ishigami (Planner): Our main goal was to make Pipo Monkey even more charming, cute, and entertaining. We developed new character actions, including the “Gotcha!” motion, and each animation and sound effect was carefully reviewed to ensure it captured Pipo Monkey’s personality. If anything felt off, we made changes right away.

    I heard the new Snake vs Monkey mode features an Astro Bot collab.

    Ishigami: Yes, a couple of bots from the Astro Bot game will make an appearance, and you can capture them just like the Pipo Monkeys. Capturing these bots isn’t required to finish the levels, but you’ll receive unique rewards if you do. Depending on the level, either a standard bot or a Pipo Monkey bot will be hidden away, so be sure to keep an eye out for them.


    Do you have any final words for new players as well as longtime fans of the original game?

    Okamura: I rarely cry when playing games, but I remember bawling my eyes out while playing the original Metal Gear Solid 3. The development of Metal Gear Solid Delta: Snake Eater was driven by our goal to faithfully capture the impact and thrill that players felt two decades ago. Metal Gear Solid 3 is the ultimate example of storytelling in games, and having dreamed of making a game like this, I now feel a sense of fulfillment. I hope everyone enjoys the story as much as I do.

    Metal Gear Solid Delta: Snake Eater arrives on PS5 on August 28. 

    Read a new hands-on report with the game.
    Metal Gear Solid Delta: Snake Eater developer interview
    blog.playstation.com
  • Evil Empire tells devs to avoid early access unless their project is 90 percent complete

    Chris Kerr, Senior Editor, News, GameDeveloper.com
    August 22, 2025 · 4 Min Read
    Via Evil Empire/Ubisoft

    The Rogue Prince of Persia developer Evil Empire doesn't believe it's worth entering early access in the current climate unless your project is at least 90 percent complete. Speaking to Game Developer at Gamescom 2025, studio marketing manager Matthew Houghton and art director Dylan Eurlings shared their thoughts on the state of contemporary early access campaigns and suggested the system is becoming increasingly risky.

    You might recognize Evil Empire as the studio that has spent years helping Dead Cells developer Motion Twin expand the franchise with DLC and free updates. Since May 2024, however, the studio has also been shepherding The Rogue Prince of Persia through a Steam Early Access campaign that culminated in an official launch on August 20. Yet, due to a variety of factors including shifting player expectations, the team had to tweak their pre-release strategy in a bid to lure in players. As a result, the studio has become more wary of early access in general.

    "To be honest, the way I see it now is that unless you're coming into early access with a 90 percent complete game, don't do it. Because players, they don't see it as early access, they see it as a game to play," says Houghton. He adds that players are entitled to express their views because they're often paying money to play early access titles, but he feels that shifting viewpoint means consumers are less willing to follow games on an early access journey that will often result in significant changes.

    Eurlings echoes that point and suggests that a perceived lack of polish and content can now be a "huge issue" during early access campaigns. He explains that Evil Empire was even forced to rethink their original roadmap after the earliest versions of Rogue Prince of Persia failed to meet internal expectations.

    "Initially we wanted to do quick updates. Very fast. In the end, we took a bit more time to ensure that each update would be a bit more chunky," he adds, noting that pivot resulted in a steady cadence of monthly updates that each packed a fair bit of clout.

    That shift came after the team conceded that early access numbers "weren't great." Houghton explains the project initially attracted just under 1,000 peak concurrent users. After they flipped the script, the title started to pull in around 4,000 CCU. "That's why we wanted to prioritize chunky content—there would be an impact every time," says Houghton, before acknowledging the title still hadn't quite met the team's CCU expectations even after that switch-up.

    It underlines the risks that come with early access, and Houghton specifically wonders whether some developers who can't hit that 90 percent completion benchmark might be better off conducting beta tests through Steam instead. If you're curious as to where Rogue Prince of Persia was when it entered Early Access, Houghton suggests the project was roughly 60 percent complete. Eurlings was slightly more conservative and claimed it was more like 50 percent. We suggested they meet in the middle.

    "Steam has evolved now. You can do beta tests and playtests through Steam. I think that's become more like Early Access. People aren't paying for it and your project might be a bit janky, but that's okay because they're still going to give feedback and it's not going to be a disaster if it's not great yet," continues Houghton.

    When asked whether Evil Empire would consider revisiting Early Access in the future, Houghton isn't so certain. "I don't know. I'm going to be honest. I've been put off by it, because especially now people are so used to games coming out and then doing live ops for three, four, and five years—so why go with early access and have to deal with the stigma that's around it? I think I would do playtests and then just release," he explains.

    Houghton adds that teams who are still convinced early access is right for them must have complete conviction in their creative vision—otherwise they might risk being derailed. "Of course you listen to the feedback and take the ideas, but you have to feed them through a filter, otherwise you'll have too many cooks. You'll have people who are super enthusiastic who think you can't do anything wrong, and then you get the people who are just super negative and people who are just throwing ideas at you that you know won't work in the game."

    It's a situation that can feel overwhelming, with Houghton explaining that Rogue Prince of Persia's game director found the onslaught "too much" at times. "You have to stick to your vision," says Houghton. "Listen, but just cherry-pick carefully."

    Game Developer attended Gamescom 2025 via the Gamescom Media Ambassador Program, which covered flights and accommodation.
    #evil #empire #tells #devs #avoid
    Evil Empire tells devs to avoid early access unless their project is 90 percent complete
    Chris Kerr, Senior Editor, News, GameDeveloper.comAugust 22, 20254 Min ReadVia Evil Empire/UbisoftThe Rogue Prince of Persia developer Evil Empire doesn't believe it's worth entering early access in the current climate unless your project is at least 90 percent complete. Speaking to Game Developer at Gamescom 2025, studio marketing manager Matthew Houghton and art director Dylan Eurlings shared their thoughts on the state of contemporary early access campaigns and suggested the system is becoming increasingly risky. You might recognize Evil Empire as the studio that has spent years helping Dead Cells developer Motion Twin expand the franchise with DLC and free updates. Since May 2024, however, the studio has also been shepherding The Rogue Prince of Persia through an Steam Early Access campaign that culminated in an official launch on August 20.Yet, due to a variety of factors including shifting player expectations, the team had to tweak their pre-release strategy in a bid to lure in players. As a result, the studio has become more wary of early access in general. "To be honest, the way I see it now is that unless you're coming into early access with a 90 percent complete game, don't do it. Because players, they don't see it as early access, they see it a game to play," says Houghton. He adds that players are entitled to express their views because they're often paying money to play early access titles, but he feels that sifting viewpoint means consumers are less willing to follow games on an early access journey that will often result in significant changes. Related:Eurlings echoes that point and suggests that a perceived lack of polish and content can now be a "huge issue" during early access campaigns. He explains that Evil Empire was even forced to rethink their original roadmap after the earliest versions of Rogue Prince of Persia failed to meet internal expectations."Initially we wanted to do quick updates. Very fast. 
In the end, we took a bit more time to ensure that each update would be a bit more chunky," he adds, noting that pivot resulted in a steady cadence of monthly updates that each packed a fair bit of clout. That shift came after the team conceded that early access numbers "weren't great." Houghton explains the project initially attracted just under 1,000 peak concurrent users. After they flipped the script, the title started to pull in around 4,000 CCU. "That's why we wanted to prioritize chunky content—there would be an impact every time," says Houghton, before acknowledging the title still hadn't quite met the team's CCU expectations even after that switch-up. Related:"Of course you listen to the feedback and take the ideas, but you have to feed them through a filter."It underlines the risks that come with early access, and Houghton specifically wonders whether some developers who can't hit that 90 percent completion benchmark before launching into early access might instead be better off conducting beta tests through Steam instead. If you're curious as to where Rogue Prince of Persia was when it entered Early Access, Houghton suggests the project was roughly 60 percent complete. Eurlings, was slightly more conservative, and claimed it was more like 50 percent. We suggested they meet in the middle. "Steam has evolved now. You can do beta tests and playtests through Steam. I think that's become more like Early Access. People aren't paying for it and your project might be a bit jankybut that's okay because they're still going to give feedback and it's not going to be a disaster if it's not great yet," continues Houghton. When asked whether Evil Empire would consider revisiting Early Access in the future, Houghton isn't so certain."I don't know. I'm going to be honest. 
I've been put off by it, because especially now people are so used to games coming out and then doing live ops for three, four, and five years—so why go with early access and have to deal with the stigma that's around it? I think I would do playtests and then just release," he explains. Related:Houghton adds that teams who are still convinced early access is right for them must have complete conviction in their creative vision—otherwise they might risk being derailed. "Of course you listen to the feedback and take the ideas, but you have to feed them through a filter, otherwise you'll have too many cooks.You'll have people who are super enthusiastic who think you can't do anything wrong, and then you get the people who are just super negative and people who are just throwing ideas at you that you know won't work in the game." It's a situation that can feel overwhelming, with Houghton explaining that Rogue Prince of Persia's game director found the onslaught "too much" at times. "You have to stick to your vision," says Houghton. "Listen, but just cherry pickcarefully."Game Developer attended Gamescom 2025 via the Gamescom Media Ambassador Program, which covered flights and accommodation. about:GamescomTop StoriesInterviewsAbout the AuthorChris KerrSenior Editor, News, GameDeveloper.comGame Developer news editor Chris Kerr is an award-winning journalist and reporter with over a decade of experience in the game industry. His byline has appeared in notable print and digital publications including Edge, Stuff, Wireframe, International Business Times, and PocketGamer.biz. Throughout his career, Chris has covered major industry events including GDC, PAX Australia, Gamescom, Paris Games Week, and Develop Brighton. 
He has featured on the judging panel at The Develop Star Awards on multiple occasions and appeared on BBC Radio 5 Live to discuss breaking news.See more from Chris KerrDaily news, dev blogs, and stories from Game Developer straight to your inboxStay UpdatedYou May Also Like #evil #empire #tells #devs #avoid
    Evil Empire tells devs to avoid early access unless their project is 90 percent complete
    www.gamedeveloper.com
    Chris Kerr, Senior Editor, News, GameDeveloper.com
    August 22, 2025 · 4 Min Read · Via Evil Empire/Ubisoft

    The Rogue Prince of Persia developer Evil Empire doesn't believe it's worth entering early access in the current climate unless your project is at least 90 percent complete.

    Speaking to Game Developer at Gamescom 2025, studio marketing manager Matthew Houghton and art director Dylan Eurlings shared their thoughts on the state of contemporary early access campaigns and suggested the system is becoming increasingly risky.

    You might recognize Evil Empire as the studio that has spent years helping Dead Cells developer Motion Twin expand the franchise with DLC and free updates. Since May 2024, however, the studio has also been shepherding The Rogue Prince of Persia through a Steam Early Access campaign that culminated in an official launch on August 20.

    Yet, due to a variety of factors including shifting player expectations, the team had to tweak its pre-release strategy in a bid to lure in players. As a result, the studio has become more wary of early access in general.

    "To be honest, the way I see it now is that unless you're coming into early access with a 90 percent complete game, don't do it. Because players, they don't see it as early access, they see it [as] a game to play," says Houghton.

    He adds that players are entitled to express their views because they're often paying money to play early access titles, but he feels that shifting viewpoint means consumers are less willing to follow games on an early access journey that will often result in significant changes.

    Eurlings echoes that point and suggests that a perceived lack of polish and content can now be a "huge issue" during early access campaigns. He explains that Evil Empire was even forced to rethink its original roadmap after the earliest versions of Rogue Prince of Persia failed to meet internal expectations.

    "Initially we wanted to do quick updates. Very fast. In the end, we took a bit more time to ensure that each update would be a bit more chunky," he adds, noting that pivot resulted in a steady cadence of monthly updates that each packed a fair bit of clout.

    That shift came after the team conceded that early access numbers "weren't great." Houghton explains the project initially attracted just under 1,000 peak concurrent users (CCU). After they flipped the script, the title started to pull in around 4,000 CCU.

    "That's why we wanted to prioritize chunky content—there would be an impact every time," says Houghton, before acknowledging the title still hadn't quite met the team's CCU expectations even after that switch-up.

    It underlines the risks that come with early access, and Houghton specifically wonders whether some developers who can't hit that 90 percent completion benchmark before launching into early access might be better off conducting beta tests through Steam instead.

    If you're curious as to where Rogue Prince of Persia was when it entered Early Access, Houghton suggests the project was roughly 60 percent complete. Eurlings was slightly more conservative and claimed it was more like 50 percent. We suggested they meet in the middle.

    "Steam has evolved now. You can do beta tests and playtests through Steam. I think that's become more like Early Access. People aren't paying for it and your project might be a bit janky [...] but that's okay because they're still going to give feedback and it's not going to be a disaster if it's not great yet," continues Houghton.

    When asked whether Evil Empire would consider revisiting Early Access in the future, Houghton isn't so certain. "I don't know. I'm going to be honest. I've been put off by it, because especially now people are so used to games coming out and then doing live ops for three, four, and five years—so why go with early access and have to deal with the stigma that's around it? I think I would do playtests and then just release," he explains.

    Houghton adds that teams who are still convinced early access is right for them must have complete conviction in their creative vision—otherwise they might risk being derailed. "Of course you listen to the feedback and take the ideas [on board], but you have to feed them through a filter, otherwise you'll have too many cooks. [...] You'll have people who are super enthusiastic who think you can't do anything wrong, and then you get the people who are just super negative and people who are just throwing ideas at you that you know won't work in the game."

    It's a situation that can feel overwhelming, with Houghton explaining that Rogue Prince of Persia's game director found the onslaught "too much" at times. "You have to stick to your vision," says Houghton. "Listen, but just cherry pick [your feedback] carefully."

    Game Developer attended Gamescom 2025 via the Gamescom Media Ambassador Program, which covered flights and accommodation.

    Read more about: Gamescom, Top Stories, Interviews

    About the Author: Chris Kerr, Senior Editor, News, GameDeveloper.com. Game Developer news editor Chris Kerr is an award-winning journalist and reporter with over a decade of experience in the game industry. His byline has appeared in notable print and digital publications including Edge, Stuff, Wireframe, International Business Times, and PocketGamer.biz. Throughout his career, Chris has covered major industry events including GDC, PAX Australia, Gamescom, Paris Games Week, and Develop Brighton. He has featured on the judging panel at The Develop Star Awards on multiple occasions and appeared on BBC Radio 5 Live to discuss breaking news.
  • How's everyone doing? I've got an interesting topic I want to talk about!

    Have you heard about the Coinbase CEO, Brian Armstrong, who decided to fire some of the engineers who didn't try AI right away? After licenses were bought to cover every engineer, some warned him that adoption would be slow and that it would take time to get even half the engineers using AI. Look where we've ended up!

    Honestly, that lack of enthusiasm makes me think about how technology can drive innovation. People are sometimes afraid of change, but in a fast-moving world, we have to be ready.

    If you share this view, we'll need to be braver and more willing to take on challenges.

    https://techcrunch.com/2025/08/22/coinbase-ceo-explains-why-he-fired-engineers-who-didnt-try-ai-immediately/

    #Technology #AI #Coinbase
    After getting licenses to cover every engineer, some at the cryptocurrency exchange warned Armstrong that adoption would be slow, predicting it would take months to get even half the engineers using AI. 
  • Good people, have you ever thought about how dangerous the predictions made by intelligent systems can be in the world of finance?

    My new article tackles an important topic: "Why LLM predictions are dangerous for Finance?" These predictions are more than just data; they can be a double-edged sword in the world of investing. The idea is that we need to understand the potential consequences of small changes in the models, especially in a sector as sensitive as finance. These technologies can produce nearly accurate numbers, but sometimes the outcome is catastrophic.

    Personally, I've had experience with trading that relies on careful analysis, yet despite all this artificial intelligence, the final choices remain in human hands, and that is the pivotal point. We have to be aware of our responsibility.

    Take some time to think about this topic, because the future demands careful choices from us.

    https://dev.to/pvgomes/why-llm-predictions-are-dangerous-for-finance-j2i
  • Did you know that 95 percent of projects that rely on AI fail?

    A new MIT report flags this shocking fact, but the surprise isn't the high failure rate; it's the reasons behind it. The study says the problem isn't AI technology itself but how companies try to use it. In other words, the way we go about things is what causes the failure!

    Personally, I've watched companies pour big investments into AI without understanding the basics of how to work with it. Sadly, they waste their time and money. We need to think carefully before diving into this field.

    This topic opens our eyes to the need for a deep understanding of what AI really is and how we can benefit from it properly.

    https://fortune.com/2025/08/21/an-mit-report-that-95-of-ai-pilots-fail-spooked-investors-but-the-reason-why-those-pil
    The lessons from the MIT study were less about what's wrong with AI models and more about what's wrong with the way companies are trying to use them
  • How's everyone doing?

    Today I want to talk about a really important topic: why does central bank independence matter? Look at how things went in Ronald Reagan's economy with Paul Volcker, when he raised interest rates to nearly 20 percent!

    The article says independent central banks can make tough decisions, like raising rates to fight inflation, which makes borrowing more expensive. So if you're planning to buy a house or start a business, be aware that these decisions can affect you directly.

    I saw how these things affected my own family, how they tried to pursue their dreams at a time when rates were high. The point is to understand that the economy isn't simple, and central bank decisions can be decisive.

    Always keep up with economic news and policy decisions; they may well matter in your daily life.

    https://fortune.com/2025/08/22/why-federal-reserve-independ
    Independent central banks can more easily take unpopular steps to fight inflation, such as raise interest rates, which makes borrowing more expensive.
  • Hey everyone, did you see Wirtz's performance for Liverpool?

    It was his first appearance, and even though he wasn't at his best, this player has a refined technical touch! They said he had "silky touches," and you'll notice the difference in the coming rounds. The atmosphere is changing, and the passion is rising with it!

    From what I watched, I felt he has great potential and the ability to shine. I can't wait to see him play again and show more of his talent! I've always believed that new players need a little time to gel, but this Wirtz, I have a feeling he'll have a big impact on the team.

    Let's stay optimistic and see how far he goes!

    https://www.skysports.com/football/news/11095/13415669/florian-wirtz-is-liverpools-running-man-why-the-germany-internationals-work-rate-is-just-as-important-as-his-technique
    #Liverpool #Football #FlorianWirt
    The Wirtz stat you might have missed on his Liverpool debut
    “He got brought off on Friday night and probably was not at his absolute best but he still had some silky, lovely touches and when you watch him play, there is something there. I am already looking forward to seeing him next Monday.”