OBJECTIVES

Xynesis is a spin-off of SSRC and offers producers of video games and computer animations three products: first, a rendering technology called NATURAL SHAPE APPROXIMATION; second, a programming language called QR that extends and supersedes C++; and third, middleware that streamlines production and enhances productivity.

We believe video games are about making things possible. We have a vision of what digital reality can ultimately be and of what it is that drives video games.

ETYMOLOGY

Xynesis ['zinisis] is ancient Greek and means the innate power of the mind in its highest form, uncompromised intuition, discipline and ethics. In ancient Greek literature the term Xynesis is closely related to the act of uniting, sagacity and conscience. The Greek historian Thucydides, 5th century B.C., spoke of Xynesis as the highest form of human virtue and concluded that after the end of the Periklean Age (444-429 B.C.) no Greek leader possessed this quality any more. In the centuries that followed, the word faded from the Greek language and was lost.

XYNESIS

We take the challenge inherent in Xynesis' etymological origin to heart. We believe that the realities springing forth from our digital animations reflect human desires and dreams. Video games could ultimately resemble life's challenges in all their facets. Yet the reality falls short of the potential.

How often have you wished that the video game you were playing allowed you to use its environment a little more intelligently, to turn a tool or the specifics of a location to your advantage, for a cool idea you just had? The visuals were all there, but the game just didn't allow you to realize your plan. Our intuitive reaction is to hold the game designer responsible for the limitation. But a closer look reveals that the reason is really a limitation of the prevailing paradigm: artwork and function (code) are produced separately.

We believe that The Matrix by the Wachowski brothers is right about one thing: that everything could be animated by tiny digital building blocks. For this to work there must be a xynthesis of artwork and code, some prototype building block adaptable to any form and context. But it also requires some form of genesis that lends these building blocks inherent qualities, providing for interactive forces between them and the capability to mould and shape increasingly complex objects from them. This is the motivation for the creation of Xynesis' technology: to develop a process that does this simply, straightforwardly and efficiently, a process to express reality.

SOLUTIONS FOR A HIGHLY COMPETITIVE MARKET

Nobody working in the video games industry doubts that there is enormous pressure on publishers and developers alike. On the one hand both have grown increasingly conscious of costs. On the other hand the consumer's ever-growing desire for more interesting, absorbing experiences presents a permanent challenge. In today's highly competitive environment, it has become increasingly difficult to distinguish a new product and to raise the bar on what matters most in the market: the experience lived through by the player while interacting with the game world. The producer has two issues to focus on: content and the means of delivering the experience in an immersive, interactive way. This leads to a highly demanding production process.

Given the growing pressure on producers to save on costs on the one hand and to produce more interesting, immersive content and playing experiences on the other hand, the production process is highly dependent on technology, and its efficiency is a critical factor for success and a competitive advantage.

The products Xynesis offers push the edge in three distinct areas: rendering and graphics performance; advanced AI and persistent physics; and productivity in creating interactive animations. Please read on.

NATURAL SHAPE APPROXIMATION

This is a recently developed innovative approach to rendering entire 3D scenes without any spatial computations. It is significantly faster than existing technology that relies on wire frame coordinates in 3D space. The Natural Shape Approximation utilizes a simple object relation graph, the nodes of which represent objects in a scene and the edges the relative distance between them. The information associated with the graph consists only of one fuzzy logic term for each distance, e.g. far, near, close, adjacent, joined, equal to, and two parameters for each object, its volume and material. Based on this information alone, the entire layout of a scene and the shapes of the objects are determined.
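
By way of illustration, the graph described above could be modelled along the following lines; this is a minimal sketch in C++ terms, and all type and member names are our own placeholders rather than the actual data structures behind the Natural Shape Approximation.

    #include <cstddef>
    #include <string>
    #include <vector>

    // Minimal sketch of an object relation graph as described above.
    // All names are illustrative placeholders, not Xynesis' actual types.

    enum class Distance { Far, Near, Close, Adjacent, Joined, EqualTo };

    struct SceneObject {
        std::string material;  // material identifier, e.g. "water" or "stone"
        double      volume;    // the only other per-object parameter
    };

    struct Relation {
        std::size_t from;      // index of the first object
        std::size_t to;        // index of the second object
        Distance    term;      // single fuzzy logic term for their distance
    };

    struct ObjectRelationGraph {
        std::vector<SceneObject> objects;    // nodes of the graph
        std::vector<Relation>    relations;  // edges of the graph
    };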

At the core of the technology are three steps that combine to create highly realistic, intricate and complex shapes. First, a fractal logic is applied that works similarly to how the foci of an ellipse determine its shape; the interrelations of the objects marked as joined in the graph are the parameters for this process and control how more complex shapes are constructed. Second, any adjacent objects are factored in and affect the final shape; this applies to liquids, e.g. a body of water that takes its shape from its surroundings, as well as to soft bodies that are squeezed depending on material properties. Third, the final projection of the 3D scene onto a two-dimensional screen is performed by an algorithm that traverses the objects in the focus of the camera. This approach supports incremental rendering strategies that adjust the level of detail on a per-object basis or based on distance.

Overall the superior performance allows for high resolutions with fewer resources and finer control of the level of detail attributable to individual objects. Also, with this technology collision detection becomes obsolete for the purposes of visualization; the object relation graph ensures mutual exclusion. Regarding transparency and reflection, the rendering technology incorporates a materials-based model that delivers correct visuals when the camera views a scene through, for example, water or glass. Existing tools can remain in use; conventional wire frame meshes could easily be transformed into the object relation graph used by the Natural Shape Approximation.
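
To make the last point concrete, such a transformation could bucket the pairwise distances between the bounding volumes of a mesh into the fuzzy terms of the graph, roughly as sketched below; the thresholds and helper types are assumptions chosen purely for illustration, not part of any actual toolchain.

    #include <cmath>

    // Hypothetical helper for deriving a fuzzy distance term from the gap
    // between two bounding spheres taken from a conventional wire frame mesh.
    // The Distance enum is repeated from the sketch above; the threshold
    // values are arbitrary illustration values.

    enum class Distance { Far, Near, Close, Adjacent, Joined, EqualTo };

    struct BoundingSphere { double x, y, z, radius; };

    Distance classify(const BoundingSphere& a, const BoundingSphere& b) {
        const double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        const double gap =
            std::sqrt(dx * dx + dy * dy + dz * dz) - a.radius - b.radius;
        if (gap <= 0.0)  return Distance::Joined;    // touching or interpenetrating
        if (gap < 0.1)   return Distance::Adjacent;
        if (gap < 1.0)   return Distance::Close;
        if (gap < 10.0)  return Distance::Near;
        return Distance::Far;
    }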

QR

QR is a general purpose programming language designed for the serious programmer; it makes the development of applications that have real-time requirements, or that require a high volume of interactions between objects, more efficient. QR is interaction-centric and focuses on effective means to model and structure the way objects interact. A programmer can partition an application into manageable pieces by defining the interactions within a system and where they take place.

QR brings to the realm of application development what XML did for markup languages: it adds well-formedness and superstructure without overhead. QR combines the best of programming and scripting without their drawbacks. It manages to improve the flexibility and the structure of the software engineering process simultaneously. What follows is a brief tour of QR and its main features.

Probably the most significant difference from C++ is the way QR implements the notion of abstraction. In QR an object is essentially an abstract shell; its properties depend entirely on the interactions that can be invoked on it. Unlike in C++, these interactions are not members of an object; each object merely specifies the classes of interactions that are applicable to it. From a programmer's point of view code is written only for the interactions; there is no need to implement visible or hidden data structures for an object. The entire structure and logic of a QR application rests in the definitions of its interactions and in the relationships between the objects that interact with each other. The interface of a QR object is simply the set of interactions applicable to it. This simplifies application development substantially without sacrificing flexibility or performance.

While QR objects are essentially abstract - code is written only for the interactions - this in no way prevents multilayered object structure but actually unlocks its real power. Multilayered object structure is achieved in QR by exploiting the inherent interface character each object has. An object is said to enclose inner objects if these are accessible only through interactions invoked on the enclosing object. The inner objects and the interactions between them are said to live in the enclosing object's space. This makes the definition of encapsulated modules and namespaces intuitive and straightforward. Inner objects can have inner objects as well. This simple mechanism of layering structure recursively over structure provides a remarkably powerful way to design intricate application behaviour without incurring overhead. To further aid the development process and to allow for more flexible and adaptable objects, QR's interactions have a common superclass, can be extended, and follow the same rules that made C++'s polymorphism a key to efficient application design.
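
Since QR's syntax is not shown in this document, the following is only a rough C++ analogy of the enclosing/inner relationship: inner nodes are reachable solely through something invoked on the enclosing node, which thereby defines their space.

    #include <cstddef>
    #include <memory>
    #include <vector>

    // Rough C++ analogy of QR's enclosure, not QR source: inner nodes are
    // private to the enclosing node and reachable only through it.

    class Node {
    public:
        // The enclosing node's "interaction": the only way to reach an inner node.
        Node* reachInner(std::size_t index) {
            return index < inner_.size() ? inner_[index].get() : nullptr;
        }

        // Inner nodes can enclose further nodes, layering structure recursively.
        void enclose(std::unique_ptr<Node> inner) {
            inner_.push_back(std::move(inner));
        }

    private:
        std::vector<std::unique_ptr<Node>> inner_;  // the enclosed space
    };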

QR combines the benefits of bottom-up and top-down software development strategies. On the one hand, QR enables the programmer to design objects straightforwardly, similar to how nature assembles complex organic structures bottom-up, from simple molecules through highly adaptable cells to high-level organisms. On the other hand, the inherent interface character each QR object has provides a very natural and intuitive form of modularisation that allows the programmer to develop applications top-down, by first defining abstract high-level objects and successively adding internal structure on demand. QR is effectively a rapid prototyping system that eliminates many side effects of manually designing C++ members and also automates interface and module definition. All in all this leads to faster development cycles and more robust programs that are easier to maintain.

The key concepts of QR are the node and the link. A node is essentially an abstract placeholder for an object. The information in the node defines which interactions are applicable to it. Associated with a node are links. Each link defines a target node and the interaction to be invoked on the target. Interaction processing continues from node to node. The result of an interaction invoked on a target node determines which links are processed next. This mechanism is QR's primary control of execution; the application's structure can be depicted as a graph comprised of nodes and of edges that represent the links between them. Processing continues along paths in the graph.

A node contains no program logic or methods but consists only of two tables. The first, called the interactivity table, defines the interactions for which this node is a valid target. The second, called the interaction response table, is an ordered list of links that specify the interactions triggered in response once an interaction targeting this node has been processed. The interactions specified by the interaction response table are invoked in the order given until one interaction acknowledges with ACCEPT. At this point processing continues at the node targeted by that interaction.
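
Reduced to C++-style data declarations, such a node might look roughly as follows; the type names and the use of plain integers as interaction identifiers are illustrative assumptions, not QR's actual representation.

    #include <cstddef>
    #include <vector>

    // Illustrative sketch of a QR-style node reduced to its two tables.

    using InteractionType = int;          // placeholder id for a class of interactions
    using NodeId          = std::size_t;  // placeholder handle for a node

    struct Link {
        NodeId          target;       // node on which the interaction is invoked
        InteractionType interaction;  // which interaction to invoke there
    };

    struct Node {
        // Interactivity table: interactions of which this node is a valid target.
        std::vector<InteractionType> interactivity;

        // Interaction response table: links tried in order until one of the
        // invoked interactions acknowledges with ACCEPT.
        std::vector<Link> responses;
    };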

Interactions can ACCEPT, PASS, REFLECT, BLOCK or SPAWN an invocation: PASS invokes the next interaction listed, REFLECT returns the flow of control to the triggering interaction, BLOCK terminates the process, and SPAWN is similar to ACCEPT but allows another interaction to be invoked at the same node; effectively it creates a new process and forks the execution path. This structured yet simple mechanism of invoking interactions provides QR with a sophisticated control of execution that enables adaptable, context-sensitive, parallel execution paths for distributed processing. This takes a huge workload off the programmer, who merely has to specify an object's interactivity regardless of the target system's architecture; QR distributes processing automatically to multiple cores or other computers, if available. It also provides a highly adaptable security model directly supported by the language: the entries with top priority in the interaction response table can be allocated for validation and access control.
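
Continuing the sketch above, the five acknowledgements could map onto a small dispatch loop along the following lines; this is our own reading of the described semantics, with the forking of SPAWN left as a comment.

    #include <functional>
    #include <vector>

    // Possible acknowledgements of an interaction, as described above.
    enum class Ack { Accept, Pass, Reflect, Block, Spawn };

    // An invocable interaction, abstracted to a callable for this sketch.
    using Interaction = std::function<Ack()>;

    // Walk an interaction response table in order.
    bool processResponses(const std::vector<Interaction>& responses) {
        for (const Interaction& interaction : responses) {
            switch (interaction()) {
                case Ack::Accept:
                    return true;   // processing continues at the accepted target node
                case Ack::Spawn:
                    // like ACCEPT, but another interaction may still be invoked at
                    // this node; a real runtime would fork the execution path here
                    break;
                case Ack::Reflect:
                    return false;  // control returns to the triggering interaction
                case Ack::Block:
                    return false;  // the process terminates
                case Ack::Pass:
                    break;         // try the next listed interaction
            }
        }
        return false;              // no interaction accepted
    }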

QR provides flexible and efficient facilities that support well-structured applications. Interactions can be inherited and extended to define more specialized interactions. A node that accepts a given type of interaction also accepts its subclasses. Nodes can be grouped together to provide interfaces that support more complex interactions. Interfaces to local interaction patterns that are not accessible otherwise enforce encapsulation and effectively define a module. The module's interface represents it externally. The local interaction patterns are defined within the module's local namespace and constitute its internal logic. Modules can be cloned and extended in object-oriented fashion and used anywhere in place of a node. This degree of flexibility encourages hierarchical program construction with a high level of modularity and reusability.

QR applications can easily interface with any other technology; an interaction that communicates with another application is called a gateway and can be connected to any QR link. The structure of a QR application is not static but supports the dynamic (re)negotiation of node/link structures at runtime and the import of interactions on an as-needed basis. This makes QR highly adaptable to different contexts, gives it a small memory footprint and allows a QR application to be constructed incrementally by an automated agent, locally or remotely. It is a feasible scenario to develop an application partly in C++ and partly in QR to benefit from the strengths of each. A programmer could, for example, use QR's strength in interaction processing for AI and physics, and employ C++ for implementing the general application logic and the management of system resources. QR's flexibility and adaptability make it an ideal tool to interconnect different technologies.

Highlights and key features are:

MIDDLEWARE

Xynesis' middleware builds on the individual strengths of QR and the Natural Shape Approximation and combines both into one solution that blends the creation of artwork seamlessly with the design of an application. We envision something analogous to the Wachowski Brothers' Matrix, in which programs animate all aspects of life. Starting with tiny digital building blocks at the quantum level and reaching up to highly efficient algorithms that animate complex actions and objects persistently, it incorporates a whole set of enabling technologies that address the issues of persistency, scalability and interactivity; it makes game worlds feasible in which objects could be broken into components or formed from raw or processed elements, and in which objects share inherent, universally adaptable forces that give them unlimited interactivity. QR enables us to implement this efficiently and the Natural Shape Approximation provides us with the facilities to animate these effects visually.

A brief overview of the technology and a demo currently under development is available here: OVERVIEW DEMO (pdf, 29 KB). The first demo will be released synchronously with QR.

STATUS

The Natural Shape Approximation is currently the subject of a patenting effort. Once this process is complete, we plan to license it to graphics hardware manufacturers. Its use will most likely be free for software developers.

QR, its IDE and its runtime component will be released in one comprehensive package that provides a one-stop solution for developers. For up-to-date information and release dates please check the news section below.

++++++++++++++++++++++++++++++++++

[17 February 2013]
Nearing breakthrough
Again things took time, much longer than anticipated.

In a way the subject of the work itself took precedence, not in the fashion of a well-defined roadmap but rather as an unwavering commitment to reach our ultimate goal without compromises. The work on this started in October 2011 and went from research to prototyping. The goal: a universal AI that sits logically below game mechanics and design, an AI that is universal in the sense that it can adapt to a wide variety of possibly dynamically changing game mechanics that sit logically atop the AI. It's by far the most advanced programming work I have ever undertaken, and looking back at the research and concepts required, ranging from principles of quantum mechanics to independently operating cells, it probably ranks among the most complex research tasks in living memory.

We are nearly there. There is no doubt anymore that we will meet our goals in full. A fully functional prototype, which will be applied first to our Heroes of Might and Magic V showcase, is around the corner. It has to be said that if I had had a clearly defined roadmap, as in a commercial development, the project would have been completed earlier but with a lot of limitations and compromises. It turned out that especially the last four extended months of this work were exceptionally productive. In this time most of the finer and more powerful concepts materialized in full, a lot of circuits closed, and the most significant research paths came full circle, yielding exceptional results.

Many goals that I had earlier anticipated to be nearly impossible to reach, such as full combinatorial prediction of very large possibility spaces, were simply solved by putting together the results of the completed individual research paths. Looking back, I would say easily over 90% of the value was created in the last four months, yielding a vastly more powerful machine. This is a case where commercial production constraints would indeed have cut the project short and compromised most of its value.

In this sense it was fortunate that I could perform this work as an independent, and it made me appreciate that my career path didn't lead me to the position of chief technology officer in a big studio, because most likely I would never have had the freedom required for this. Money and business opportunities can indeed be a limiting factor.

What comes now has the potential to be a full revolution in the digital space. It's not only a highly advanced AI that can drive all kinds of games. It has far-reaching consequences: an AI that can facilitate non-linear writing, dynamic instancing in persistent online worlds, and much more. It's a bit early to begin speculating about what will be possible with the AI breakthrough, but what we have examined for our own advanced game design is very promising.

Stay tuned! /sg

[19 February 2011]
Advanced non-cheating AI
Finally! It took something of an age, but we have now got approval to publish what we have developed on the AI front.

It's a new, advanced, non-cheating AI for Heroes of Might and Magic. An AI for a turn-based game, how could this be challenging? Make no mistake, the AI we developed is highly ambitious and its components, for example the integrated pathfinder, had to be faster than what is typically required of a real-time strategy game. If you are interested in background information on the AI, you can find it here: H5 AI. In general this AI's architecture is not limited to turn-based games. It can also be employed in all kinds of real-time settings.

Even better, this AI is now at the core of a fully fledged game project based on Heroes V. Its goal is to create a premier strategy game driven by an advanced non-cheating AI that can hold its own against veteran strategists. One objective is to demonstrate to developers worldwide what benefits a fully featured AI truly brings to a game, why it enhances the user experience substantially and how it adds value to many other game elements. Details on the project are available at Heroes 5.5 - Eternal Essence.

Apart from that we have begun the initial work on our own unique strategy game that will roll out the first stage of QR, including game elements made out of components created from small building blocks. More information on this will follow. /sg

[26 September 2010]
Waking up
It has now been a fair bit of time since we decided to help out our parent company and develop Bond Disc. What started out as a modest security tool soon turned into a monstrous development effort.

With a kernel mode driver and a GUI application, both multi-threaded and targeted at multiple Windows platforms, we really had our work cut out for us. The project being a security solution added a whole lot of extra constraints. Anyone who has worked in Windows kernel mode and has shipped an application for multiple platforms can imagine how complex this work was. Throw in the way Microsoft tends to document features that are almost compatible from one version to another, and the number of tools you have to deal with, from the driver development kit and WMI to distributed COM objects, shell extensions, InstallShield and CertMgr, and you get the picture.

Still, I believe it was the right decision to help our parent company get their house in order, and looking at the product that is out now, there is music in it. And the lessons learned are valuable, not least because QR has its own multi-threaded, automatically parallelizing kernel mode layer that drives the interactive elements.

Now that Bond Disc is out, we are back in the ring. There will be news about the prototype AI we have developed for an AAA title in the near future. /sg

[30 September 2008]
QR — when?
It's now been nearly a year and some explanations are in order. There are a number of reasons that bogged us down, but the most important one is the way SSRC, our parent company, is organised. SSRC is a company specializing in research and development, and the funds we receive make us to some degree financially independent. Unlike most venture-capital-funded software companies, we are not forced to deliver products based on a roadmap defined by a business plan. Instead we have a different mandate: to get it right. Everyone knows our work is extremely complex; after all, we have set out to reshape information technology in the form of automatically parallelising code that is easier to manage than C++ but no less efficient. Not to mention all kinds of legal wrangling that you can get into if you try to patent technology and set up shared facilities with research partners in different parts of the world.

Phone card: transputer motif

Image: SSRC has its roots in massively parallel simulation and transputer research (1993)

Apart from that we had some more mundane issues. We spent significantly more resources on the development of an AI for an AAA title than originally intended. While the working prototype has been finished for a good while now, there are still discussions at the publisher about which direction the next installment in the series will take and how the new AI will be employed. Of course we signed an NDA, so we cannot provide more information about this project here right now. We also had some trouble with Canada Immigration regarding visas and work permits, and because the development of the AAA title happens in Europe, we decided to relocate once more to Germany. I really miss Vancouver's fresh air.

Where does all that leave us now? A couple of weeks ago I read an interview with Epic's Tim Sweeney about the direction GPUs are taking. I couldn't agree more with the core points he makes. With Intel's Larrabee appearing on the horizon, a paradigm shift to freely programmable multi-cores and shaders, and NVIDIA most likely following suit rather than missing out on the technology's benefits, there will be substantial consequences for PC hardware and consoles as well as for rendering technology. But most importantly, the hardware innovation cycle has to be matched by a software innovation cycle. That is what we are aiming at: to have QR as a vital part of the next innovation cycle, essentially to provide the professional means to develop automatically scaling and parallelising applications for multi-cores in an economically feasible way. This is a tough challenge but we have gone a long way towards that goal in our work on QR. We have little doubt that this is where the world is going. /sg

[24 November 2007]
AI
We are currently doing the AI for an AAA title for one of the large publishers. We had to reshuffle some of our resources, but it was a chance too good to pass up, and there should be something for you to check out before the end of the year. /sg

[14 May 2007]
The final stretch
To simplify matters we have decided to release QR initially without the NSA, based on DirectX and OpenGL alone. Given the current development status, and depending on whether some more advanced visual design features will be included in the initial release, the time frame for its completion is between 100 and 200 days from now. The first platform for the release will be Windows XP, with a release for Linux following as soon as possible. /sg

[14 May 2007]
QR and NSA
I was recently asked whether I am in favour of software patents. No, definitely not. We are patenting the Natural Shape Approximation in order to license it to graphics chip manufacturers, because it is a uniquely new way to describe shapes dynamically and to build more efficient hardware. Software developers will be able to use the NSA for free, i.e. it will be included in the QR license. Apart from being a programming language, QR has at its core a visual design philosophy that lets developers interactively adapt the objects they work on, starting with defining the relations of a simple object to a complex world. Graphics are essential for this approach and bindings for DirectX, OpenGL and later NSA will be provided. /sg

[23 January 2007]
SOHO Vancouver
We have now set up a small space in a quiet residential area of Vancouver to complete the work on QR. You can find us at #202-1451 Burnaby Street; please call us in advance at (604) 484-0922. Everything is ready to go, and there should be something for you to check out in the next couple of months. The more I play around with QR and its context-sensitive, process-driven approach, the more I am convinced that OOP was but one rung on the ladder of the evolution of programming methodology. QR's context-sensitive interaction processing feels very natural and lends itself easily to the usually more distributed algorithms typical of AI and physics. Once it is adopted, I expect games will start to massively push the envelope in AI and physics. Stay tuned. /sg

[21 November 2006]
Moving Around
"Nothing is ever easy." Actually an homage to Terry Goodkind's brilliant Wizard's First Rule, this fits current affairs quite well. At the beginning of January we will most likely move to a new location in Vancouver. In the meantime you can reach us at 1.604.682.5915 (full contact details). I'd also like to use the time to readdress certain aspects of the technology. I'm hesitant to set a new date for the public QR release; it will be announced once I'm finally satisfied that the product meets all our targets.

Stefan

[17 May 2006]
Moving On
We decided to set up shop properly in Vancouver. On the technological front, QR's visual IDE has grown into a solution that blurs the line to Xynesis' middleware, particularly with respect to physics, AI and 3D modelling. There are a few legal issues, concerning setting up the company and licensing the technology, that we would rather see resolved before QR is released, which most likely delays QR's public release until the end of Q3. However, if you are interested in taking a closer look at QR and would consider incorporating it into a next-generation title, please call us and we can arrange a tour for you at Xynesis later this summer.

[19 Apr 2006]
Interactivity and Game Design
We have updated Evolutionary Video Games. This article explores why games should be designed around interactivity. As technology moves forward new opportunities emerge for more refined and constructive gameplay that empowers game designers and gamers alike.

[23 Mar 2006]
Release Date
We plan to make QR available via digital download from a dedicated website. We still have to iron out the licensing agreement and set up the server infrastructure. Work also continues on various QR components and the demo. Given our current roadmap, we expect QR to be available on May 20th for PC and Linux.

[23 Mar 2006]
Demo
QR will be accompanied by a demo that shows its benefits and how the interaction-centric programming paradigm native to QR can push the edge in designing advanced AI and physics. We are also preparing a tutorial that explains how to use QR's visual IDE and how QR's application structure of nodes, links and interactions is integrated with a C++ program.

[22 Mar 2006]
QR Update
QR has undergone a number of revisions in the past months. We have been building a GUI and development environment that enables you to design advanced AI and physics straightforwardly from predefined interaction patterns. The initial release will be a QR/C++ hybrid that lets you visually design application superstructures interactively from QR nodes and links and define the interactions within a real-time application. The interactions are implemented in C++. QR and C++ are integrated seamlessly; the application superstructure is persistently represented by C++ header files, and if you prefer you can alter any aspect of a QR application from your C++ IDE.
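
What such a persisted superstructure header might contain is sketched below; this is a purely hypothetical illustration of the idea, since the actual format generated by the QR IDE is not shown anywhere in this document, and all names are our own.

    // superstructure.h - hypothetical example of a QR node/link superstructure
    // persisted as a C++ header; names and layout are illustrative only.
    #pragma once
    #include <cstddef>

    enum class InteractionId { Move, Collide, Damage };

    struct LinkDef {
        std::size_t   target;       // index of the target node
        InteractionId interaction;  // interaction invoked on the target
    };

    struct NodeDef {
        const char*    name;           // node name as shown in the visual IDE
        const LinkDef* responses;      // interaction response table
        std::size_t    responseCount;  // number of entries in that table
    };

    // Example: a "hero" node whose responses forward to a "monster" node (index 1).
    inline constexpr LinkDef kHeroResponses[] = {
        {1, InteractionId::Collide},
        {1, InteractionId::Damage},
    };

    inline constexpr NodeDef kNodes[] = {
        {"hero",    kHeroResponses, 2},
        {"monster", nullptr,        0},
    };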

++++++++++++++++++++++++++++++++++

MANAGEMENT TEAM

That's currently just me, Stefan Gollasch (47). I would describe myself as an inventor but am frequently called a computer scientist, programmer, consultant or engineering manager. My professional background is in computer science and quantum physics. From more than fifteen years of professional experience developing innovative technologies that continually push the edge, I have a firm grasp of what is technically feasible and of how to manage projects in order to reduce complexity and achieve maximum synergies. I am capable of breaking complex technical challenges into manageable parts and have excellent communication skills for describing intricate technical issues in easily understood terms. I am equally skilled at analysing the market and the needs of producers and customers alike, and at designing and implementing solutions and the tool chains that adequately support a project.

EVOLUTIONARY VIDEO GAMES

Today you increasingly hear people speak of a creative crisis in the video games industry. The discussion usually revolves around the question of which games people really want and tries to draw a line between mainstream and hardcore gamers. Unfortunately, it is this very pattern of thought that stifles creativity; the people who decide which games will be produced have a habit of casting gamers into categories and of thinking about their product in terms of market share and acceptance.

Video games are much more than just another product that delivers a predesigned experience, like films and books. What you sell is a multidimensional experience that is interactively created by the players themselves, and there is absolutely no reason why a well-designed game should cater for mainstream or hardcore gamers only. Interactivity is what sets video games apart from other media and it is the key to empowering gamers, to giving them a degree of freedom such that they can create an experience as they see fit. This short article examines how video games could evolve around improving interactivity as gaming technology moves forward. My focus is somewhat technical and on PC games, but I am certain that the analysis and rationale hold true for all kinds of games.

In the late 1980s I started playing computer games such as Starflight, Elite, Wing Commander and Sim City. These games didn't have the same visual splendour and refinement as we see today, but they did an excellent job of creating atmosphere and excitement. They literally glued us to the computer. A large part of the excitement stemmed from simply exploring the gameplay, experiencing first hand in what ways the developers had adapted technology to enhance the game's depth. If the developers had done well, the technology blended seamlessly into an immersive experience, giving us the feeling of going higher, farther and bolder, to where we hadn't been before. Nevertheless, those of us who understood how games were developed couldn't help but marvel at the human minds that mastered the technology driving our imagination.

While part of the technological progress was obviously dependent on hardware like memory, CPU speed and storage capacity, it was always clear that the real engineering challenge was to implement as much of any set of desired features as the limited resources permitted. Technologically there was a level playing field for all, established and newly founded studios alike, and the barrier to entry was comparatively low, notably without the excessive legal wrangling and capital constraints we see today. The real difference manifested itself in the level of skill mastery each team had, which in turn encouraged a highly competitive spirit to use technology ingeniously.

Back then the studios tended to treat gamers as a unique group of people who looked for a special mix of fun, challenge and excitement. From a player's point of view the use of technology mattered, not only because it enables interactivity in computer games, but because it adds another dimension to the experience. To interact with a controller, to experience immediately what elements are responsive to your actions and how you can drive what happens on the screen, makes things feel tangible. This helps to create the feedback loop that lets you intuitively DO things and feel as if you are a part of the animated game world. It makes a huge difference whether you are interacting through a streamlined interface, or have to do tedious tasks again and again, or have your experience interrupted by bugs. It affects how the visual and aural stimuli enhance and capture your imagination. Anyone who plays games can tell the difference. Once you have played a few games you appreciate the games that do things really well and the achievement of their developers. More importantly, you are aware that the state of the art is constantly evolving and that there is much, much more to be had. You naturally develop an expectation of what the next games could be, an expectation that drives the industry.

Considering how things were in the early 90s, video games have come a long way. But have they left childhood yet? With success stories like Tomb Raider and Warcraft in the mid-90s, more money began pouring into the games industry and productions tended to get bigger and more ambitious. At the end of the 90s the funds available and the ingenuity of people inspired by the achievements of the video games industry had led to titles such as Relic's Homeworld (1999), Ion Storm's Deus Ex (2000), Gremlin's Soulbringer (2000), Bioware's Baldur's Gate (1998), SimTex's Master of Orion II (1996) and Black Isle's Planescape: Torment (1999). Each of these titles did so many things right and created a depth of gameplay that hasn't been surpassed since. Essentially all the ingredients we love in modern video games were already present then: the flawless presentation of a real-time strategy masterpiece of truly epic proportions by Relic; the rich atmosphere, seamlessly integrated cinematic cutscenes, intriguing magic system and traps of Soulbringer; Deus Ex's blend of conspiracy thriller and original locations interactively accessible through a wealth of interactive objects and playing styles; the subtle characters, living artefacts and unique quest revolving around ethics and identity in Torment; and in general a wealth of other things like motion capture, excellent voice acting and real-time lighting. There was no question that these games delivered a gripping experience. This was complemented by developer diaries on the web and an increasing number of good interviews that provided a treat of first-hand information and helped to build a community that anticipated new releases.

What followed was what I call the dead money trail of the video games industry. Increasing pressure from the publishers led the studios to adopt production processes not unlike those of the film industry. With very tight schedules, creativity took a backseat and the main focus was on execution; tremendous effort was invested in refining the look & feel and the artwork, polishing the graphics and making the gameplay accessible to the mass market. Everything IS focused on the experience the player lives through while interacting with the game world. It is true that this is the most important aspect of a video game, and you cannot fault the studios for that. But there is one thing amiss here. These productions tend to deliver the experience, not necessarily linearly, but within a framework similar to a movie, using interactivity as a mere means, although the interactive medium's greatest strength is to evoke a much more multidimensional experience. Consequently, sadly, many titles turned out to deliver an experience worthy of a movie but were somehow still unsatisfactory, and often you could not pinpoint why.

Some months ago I showed my cousin's sons, aged 11 and 7, 3DO's Heroes of Might and Magic III, released in 1999. This particular installment has a very user-friendly interface and probably the best enemy AI ever devised for a turn-based strategy game, which makes playing against it a pleasure and adds huge replay value. The kids took great pleasure in observing, inquiring how things work and understanding the goals of the gameplay. They also loved to watch me playing and to ask why I did this and that. And to my surprise - Heroes III centers on conquest and competition - they started out playing cooperatively without a second thought, giving each other, and their friends when these joined them later, helpful tips and hints on how to do things better. The appeal of the game has held for months now, and they divide the land fairly among themselves.

Similar to how kids explore how things work on a basic level, adults and teenagers love to experiment and explore their world. It's only that their desires are more mature, focusing on their feelings, attitudes, achievements and goals. Although game worlds are a product of fiction, the experience while playing is real. It is a unique blend of the emotions triggered by the game and the desires and associations the player has. What sets video games apart from traditional media like film and books is the ability to actively participate in and shape the events stored on the medium. Interactivity adds depth in two ways: first, it lets us actually make choices; and second, it allows us to explore much more deeply the way we make choices or break new ground. It is the second aspect that leads to a satisfying game experience for the more mature mind. Video games feature cause and effect that provides feedback to gauge the impact of actions. We experience first hand the result of our choices and the emotions associated with it, and learn whether we are satisfied with the outcome or would like to try differently next time. Although there is a huge market for games that explore this potential constructively, very few games actually do. This is one reason that The Sims is outrageously successful. It isn't so much a question of which genre a title belongs to, but more an issue of whether it successfully captures the spirit of exploration. Games such as Knights of the Old Republic, in which interactivity and story elements are crafted well together, and the appeal of The Longest Journey and Will Wright's Spore prove that.

Ultimately interactivity provides us with a degree of freedom. In video games this degree is at the same time more and also less than in our mundane world. We don't have the constraints of the roles and social context we are used to, or the same fear of mortal danger. But we are very much constricted by the decisions the game designers make and the limits the technology places on them. This contrast, between technical limitations on the one hand and our desires on the other, largely defines the way video games evolve. Interactivity is the key, and its status as the most powerful enabler must be properly reflected in the production process and the technology used therein. The current split between artwork and code cannot be sustained for long and is detrimental both to production costs and to the experience created. In many titles interactivity has increasingly become a chore required to advance through the plot. What we desire is a fresh experience and more freedom. Freedom can have many faces: freedom from conflict, freedom to choose, freedom from distractions, freedom from bugs, freedom to fight for what we believe, or the freedom to embark on an adventure, a quest, or to explore what we can achieve. Video games are about making things possible.

INVESTOR RELATIONS

Xynesis is a spin-off of SSRC and financed by it. SSRC has established a fund that finances Xynesis and offers investors a share of the royalties received from licensing Xynesis' technology.

CONTACT INFORMATION

XYNESIS TECHNOLOGY
c/o Simulation Systems Research Corporation Ltd
Venusberg 16
20459 Hamburg
Germany

Phone: +49.40.2190.6636
Fax: +49.40.3441.31
E-mail: inquiries[at]simsysresearch.com

BIO

Stefan Gollasch - I was born in 1965 in Germany, and as a child I was always interested in learning how things worked and in understanding the reasons and mechanics behind them. As a fourteen-year-old I was fascinated by Albert Einstein's theory of relativity and the idea of finding a set of universal laws that describe the processes in our world in relatively simple terms, and particularly by Einstein's mind, which left known patterns of thought behind and envisioned fundamental structure beyond matter and forces. At school I was among the first pupils to be taught computer programming, and it was immediately clear to me that this technology had nearly unlimited potential, that we effectively had a machine that would execute any set of instructions we could think of. A few years later at university I learned to address the engineering aspects of computer science - and about the limitations of the technology.

While I wholeheartedly embraced the solid theoretical foundation and practical skills taught at university, I felt that its research and curricula lagged substantially behind the fast-progressing industry. I realized that the actual shape of the technology was defined by the people who developed innovative solutions that pushed the edge. I couldn't help but be affected by this spirit and followed the developments closely, with a particular focus on processing architectures, operating systems and programming methodologies. Naturally, my own vision of where the path might ultimately lead developed, and driven by my interest in physics - I had devoted a good deal of my time over the years to truly understanding Einstein's ideas - I began to systematically research the concurrent processes apparent in nature and the lessons that could be learned by applying the same principles to the design of processing architectures and applications.

When I learned of an advanced transputer design, called the T9000, in 1991, I left university to found my first company, which specialized in the simulation of physical chemistry. The transputer was an invention of the English company Inmos, specifically designed for building large multi-processor systems. Its latest iteration, the T9000, was on par with Intel's i860, had on-chip support for high-speed links that channeled data between processors, and offered a number of other features that pushed the edge in parallel computing. Back then it promised adaptable and cheap super-computer processing grids, very much like what Sony is aiming at with its distributed processing concept for the PS3 today. With this hardware, my research on parallel designs and my knowledge of fundamental physics, I was certain that I could develop a parallel OS ideally suited to a distributed physics simulation featuring a high volume of interactions. My goal was to provide a solution that could predict chemical reactions under varying physical conditions, including zero gravity, temperatures near absolute zero, strong electromagnetic fields and extreme pressures. The simulator was to display the reactions visually and provide researchers and engineers with insights that would make many laboratory experiments redundant. The work on this machine continued until well into 1993.

Unfortunately, the T9000 was killed by Inmos' parent company SGS-Thomson in 1993. In a spate of corporate politics they decided to build DSPs in Inmos' fabs instead. At this time it was clear to me that desktop systems were constantly growing in power with ongoing miniaturisation - what until then had required a graphics workstation now fit on a graphics card - and that the markets for consumers, entertainment and simulation would inevitably converge. The concepts and technologies I had developed for the simulator were shelved. I reverted to being a researcher who develops enabling technologies.

In 1998 I developed software for a very large German engineering company that builds everything from mobile phones and railway control systems to power plants. While this provided me with valuable insights into how a large multinational organizes itself, I somehow felt constrained by the corporate politics that dictated how to do things, all too often in an illogical way. During 1999 and part of 2000 I worked through a consultancy for one of the directors of another German multinational. My task was to evaluate the documents provided by more than 150 successful German and multinational corporations for a management study into learning organisations. This turned out to be a stroke of luck. It appeared that many of the top executives from different companies knew each other personally, which significantly affected the quality of the materials provided. These were largely confidential and provided excellent insights into corporate strategies usually not available to outsiders. I was able to witness the state of the art as well as the strategic planning of human resource development, the deployment of IT throughout an organisation and the role it played in qualifying and enabling the workforce and managers. On the other hand I was entirely free to spend half my time continuing to work on the technologies I envisioned.

In 1999 the direction in which I was researching was profoundly impacted by the Wachowski Brothers' The Matrix. They had delivered a stroke of genius, not only through the movie's story but by defining reality anew. They brilliantly merged virtual reality with physics and showed us that the choices we make and the quantum field are two sides of one coin, and that it doesn't matter whether the underlying rules are of a digital or physical nature. In this light the question of whether or not our choices determine the quantum field becomes one of belief. From this emerged the tiny digital building blocks that could animate everything and that are the basis of QR and Quantum Ringcode.

The two main challenges I faced were to make these digital building blocks adaptable to any context and to animate complex scenes built from them in real time. Alex Stepanov, the creator of C++'s STL, once said that "The reason that data structures and algorithms can work together seamlessly is ... that they do not know anything about each other." QR evolved around a similar paradigm and is today a general purpose language that layers structure over objects. Each object can have inner structure that addresses the next layer of objects. To complement this, I developed Quantum Ringcode, which implements the building blocks' generic functionality. This implementation needed to scale to object structures of higher complexity without loss of efficiency and should ideally give high-level objects distinguishable qualities based solely on their layered architecture.

In 2003 QR and Quantum Ringcode had matured to the point that their use in video games middleware was feasible. In the autumn of that year I ventured to Vancouver to present the technology and the productivity gains it would bring to the development studios there. However, it turned out that the approach I had chosen, to present the technology in terms of productivity gains, was a very tough sell. How do you prove productivity? Subsequently, in 2004, I focused on making the enabling technologies accessible separately. These are the programming language QR and the Natural Shape Approximation, which I originally developed to render efficiently and incrementally the layered object structures characteristic of QR.

* * *

There is truth in all.
No power in the universe would prevent us being one.