Wolfjaw CEO: AI Will Make Good Developers Great, But It Won’t Save Bad Ones
The debate over the use of generative AI tools for game development has raged across the industry for well over a year, and it doesn't seem to be letting up any time soon. As with anything related to AI, opinions are often polarized between those who want to take advantage of the new technology to improve games or simply to speed up the ever-growing development times, and others who are revolted at the mere idea of using it for any creative endeavor. In my most recent interview, Mitchell Patterson, CEO of backend developer Wolfjaw Studios, stressed that AI won't magically […]
Read full article at https://wccftech.com/wolfjaw-ceo-ai-game-development-good-developers/

“Most People Bashing DLSS 5 Are on the Peak of Ignorance” – Veteran Game Artist Shows Exactly How Much of a Difference Lighting Can Make
To say that the reveal of NVIDIA DLSS 5 at GTC 2026 sparked a controversy on social media would be the understatement of the year. Despite strong praise from veteran tech journalists who experienced the in-person demo, such as Digital Foundry and Ryan Shrout, and the reassurance from both NVIDIA and game studios like Bethesda that the (work-in-progress) tech is fully under developer control and optional anyway, the vocal anti-AI crowd hasn't stopped condemning NVIDIA's newest DLSS addition for being "just an AI filter" and "disrespectful of the developer's original design". However, DLSS 5 also has its proponents, including veteran […]
Read full article at https://wccftech.com/dlss-5-veteran-artist-lighting-difference-georgian-avasilcutei/

Razer goes all-in on AI at GDC 2026
Razer has chosen the GDC 2026 stage to reveal a comprehensive suite of AI-powered tools designed to transform how games are developed and experienced. With the gaming market projected to hit $206.5 billion by 2028, Razer's “Future of Play” showcase highlights a shift toward “agentic” development, using AI not just to generate assets but to manage complex workflows, automate quality assurance, and drive multi-sensory immersion.
First up is Ava: what began as a 3D hologram desk companion at CES 2026 has evolved into a full AI agent. Razer Ava has transitioned from a reactive chatbot to a proactive assistant capable of understanding user intent and executing multi-step workflows across various apps and services. This is powered by the new Razer Inference Control Plane, which routes tasks between local and cloud models to maintain low latency. Ava can now interface with third-party apps like Spotify and coordinate with other Ava units to handle scheduling or meeting proposals. By managing setup and coordination, Razer positions Ava as a tool for both developers and casual gamers. Beta sign-ups for Razer Ava are currently open via Razer Cortex, with early access invitations expected to begin rolling out in the second quarter of 2026.
Razer is also addressing quality assurance processes with its updated QA Companion-AI. The main feature of this 2026 update is its “zero-integration” deployment, meaning it requires no SDKs or code changes to function. It operates through a vision-based system that analyses gameplay footage to detect rendering, physics, and animation bugs. Beyond simple detection, the tool can now generate functional and negative test cases directly from developer prompts or game design documents. Autonomous gameplay agents are also in development to execute these test cases and provide pass/fail summaries without scripting. By automating the reproduction steps and reporting, Razer aims to accelerate the QA cycle for studios of all sizes.
Rounding out the showcase is the Razer Adaptive Immersive Experience, a new runtime that unifies haptics, lighting, and audio within the WYVRN developer ecosystem. The system is designed to cut the time developers spend tuning sensory effects to as little as three days by providing a plug-and-play library compatible with Unity and Unreal Engine. It uses “Dynamic Haptics” to blend designer-authored effects with real-time “Audio-to-Haptics” (A2H) conversion, allowing the game to provide a consistent ambient baseline of tactile feedback even in moments where developers haven't manually scripted an effect. Built on the foundation of Razer Sensa HD Haptics, Razer Chroma RGB, and THX Spatial Audio+, the runtime intelligently adapts to in-game signals without overriding the studio's creative intent. This immersive layer will begin its phased rollout in the first quarter of 2026.
KitGuru says: Have you ever thought Razer would get into the developer market segment? What do you think of this new venture for the company?
The post Razer goes all-in on AI at GDC 2026 first appeared on KitGuru.