Phaser 3 & 4 Dev Log - June 2021
Things have been really busy here over the past month and there is a lot of ground to cover in this Dev Log, so let's jump right in.
Farewell, Francisco
First of all, I'd like to say thank you to Francisco (aka Gammafp in Discord) who joined the Phaser team back in December. He helped with the release of Phaser 3.50, fixed hundreds of examples, built a brand new docs system, and also created a solid unit test workflow for Phaser 4. His time with Phaser has now ended and we wish him all the best in his new role building Phaser games in the crypto world.
UK Games Fund
Every year the UK Games Fund runs a couple of rounds of funding, where eligible UK companies can apply and, if successful, receive funding towards their projects. I've never tried this before because traditionally they've only ever funded games, but after a brief exchange, the UKGF team encouraged me to apply.
So, I've spent some time this month working through the different stages of this process. Last week I submitted our final pitch video and I should hear by the end of July if we've been short-listed. If they like the pitch, you get called to an interview, and if you get through that, the funding is yours.
I've no idea if we'll succeed, or not. Competition is fierce and, I believe, even more intense this year than in previous years. I've done everything I could, so right now it's a case of fingers crossed and wait and see. However, should we be lucky enough to receive funding, I will use that money directly to hire some extra staff into the team to speed things up.
Phaser 3.55
Since the last Dev Log, there have been a couple of point releases of Phaser 3.55. These fixed issues with filled Shape Game Objects (like Arc, Ellipse, and Triangle) not rendering properly in WebGL, along with a couple of smaller updates. If you're using 3.55.0, please upgrade.
I'm now working on Phaser 3.56. This release should (hopefully) fix the issue with blue artifacts appearing when rendering in WebGL on certain flavors of Android. It will also introduce compressed texture support, which I know some of you have been waiting eagerly for. Keep your eyes peeled on the Phaser Discord or GitHub to see when this release is ready.
Phaser 4
My main focus over the past few months has been Phaser 4 and it has undergone some massive updates in that time. This includes an internal restructuring to use an Entity Component System via bitECS, a brand new set of debugging tools, and a new way to draw called Direct Mode.
Due to the size of each of these changes, I'm going to split this Dev Log into 2 parts. In this one I'll cover Direct Mode and the Debug Panel, and in the next installment I'll cover the use of ECS within Phaser and how you can use it, too.
Before I get started I will reiterate the usual warnings: Phaser 4 is not production-ready and lots of the API is currently not implemented. What I write about in this Dev Log may well be outdated very quickly. Finally, yes, you are absolutely encouraged to get involved and have a play with it. Please see the April 2021 Dev Log for extensive details on doing exactly that.
Now, onto the good stuff...
Phaser 4 Direct Mode
One thing that I really wanted to make sure was possible in Phaser 4 is the ability to easily draw directly to the canvas, even in WebGL mode.
I genuinely feel this is something missing from lots of game frameworks. If you want, for example, to display an image on the canvas, you have to create a suitable Game Object and add it to the display list first. This is all well and good, but very often it's useful to be able to render things on your own terms. This is where Direct Mode comes into Phaser 4. Let me demonstrate with a small example:
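Here's a rough sketch of what that example looks like. Treat the import root, the Game setup, and the 'On' / 'WorldPostRenderEvent' event hook as illustrative assumptions; the ImageFile, GetTexture, World and DrawImage pieces are the parts described below.

```js
import { Game } from 'phaser4';
import { On } from 'phaser4/events';
import { ImageFile } from 'phaser4/loader/files/ImageFile';
import { GetTexture } from 'phaser4/textures';
import { World, WorldPostRenderEvent } from 'phaser4/world';
import { DrawImage } from 'phaser4/renderer/webgl1/draw/DrawImage';

new Game();

//  No Loader needed - call the ImageFile function directly and await the Promise its load() returns
await ImageFile('logo', 'assets/logo.png').load();

//  Grab the texture back from the Texture Manager
const texture = GetTexture('logo');

//  A World is the only requirement of Direct Mode
const world = new World();

//  Hook into the World's Post Render event, which hands us the current Render Pass
On(world, WorldPostRenderEvent, (renderPass) => {

    //  Draw the texture at 100 x 100
    DrawImage(renderPass, texture, 100, 100);

});
```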
Here, you can see we've loaded an image. There are a couple of things worth mentioning about even this part: First, note how you can now 'await' that load. Secondly, note how we're not actually using the Loader at all! Because all we need, in this instance, is a single image, we can just invoke the 'ImageFile' function directly and call its 'load' function, which returns a Promise we can wait on. Very handy :) but I've digressed a little.
To draw the image all we need is a reference to the texture, which we can get from the Texture Manager via the 'GetTexture' function. We then create a World, which is the only requirement of Direct Mode, and hook into its Post Render Event. This event is given what is called the Render Pass. The Render Pass is generated internally every frame and can be thought of as a live stack of the WebGL state. You can use this to basically do anything that the renderer is capable of.
In the example above I've imported the 'DrawImage' function from 'renderer/webgl1/draw/DrawImage'. This function works by taking a reference to the Render Pass, which we get from the event arguments, and then all it needs is a texture and the coordinates to draw it at. As you'd expect, running it gives us this:
So far, so simple. Let's take it up a notch. The 'DrawImage' function also has optional alpha and scaling parameters, so you can scale an image by passing these floats to the function:
Here we scale the image by 4 on each axis:
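Building on the sketch above, only the DrawImage call changes. The parameter order shown here (alpha, then x/y scale) is my assumption:

```js
On(world, WorldPostRenderEvent, (renderPass) => {

    //  Full alpha, scaled by 4 on each axis
    DrawImage(renderPass, texture, 100, 100, 1, 4, 4);

});
```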
We can also draw frames from textures using the 'DrawFrame' function. This works in exactly the same way as DrawImage but there is an additional 'frame' parameter:
Here we draw 6 frames from a loaded sprite sheet:
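As a sketch, reusing the setup from the earlier example and assuming the sprite sheet has already been loaded under the key 'tiles', with DrawFrame imported from the same draw folder as DrawImage:

```js
import { DrawFrame } from 'phaser4/renderer/webgl1/draw/DrawFrame';

const texture = GetTexture('tiles');

On(world, WorldPostRenderEvent, (renderPass) => {

    //  Frames 0 to 5, laid out in a row
    for (let i = 0; i < 6; i++)
    {
        DrawFrame(renderPass, texture, i, 64 + i * 48, 300);
    }

});
```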
It's important to understand that even though we're calling DrawFrame 6 times, that doesn't equate to 6 separate draw calls in WebGL. It's still going to be batched internally, so the above is actually only a single draw call.
As with DrawImage, DrawFrame has the ability to set the alpha and scale of the frames, too:
We've loaded a 32x32 sprite sheet in this example, so we're scaling each frame by 9 horizontally and vertically:
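In sketch form, again assuming the alpha-then-scale parameter order:

```js
On(world, WorldPostRenderEvent, (renderPass) => {

    //  32x32 frames, full alpha, scaled by 9 horizontally and vertically
    for (let i = 0; i < 6; i++)
    {
        DrawFrame(renderPass, texture, i, i * 300, 200, 1, 9, 9);
    }

});
```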
If you've done any work with the Canvas 2d API before you're probably familiar with the drawImage function. One of its options is the ability for you to draw only part of an image and resize it during rendering. This is also available in v4 via the 'DrawImagePart' function:
This works in exactly the same way as in 2d canvas but is running batched under WebGL:
You can run this example here (it's better when moving)
The function allows you to specify the x, y, width, and height to copy from the texture, and then the destination x, y, width, and height to draw it at. The destination width and height do not need to match the source region, allowing you to scale during the draw, as in the demo above.
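As a sketch, with the DrawImagePart import path assumed to follow the same pattern as DrawImage:

```js
import { DrawImagePart } from 'phaser4/renderer/webgl1/draw/DrawImagePart';

On(world, WorldPostRenderEvent, (renderPass) => {

    //  Source x, y, width, height, then destination x, y, width, height:
    //  copy a 128x128 region from (0, 0) and stretch it to 256x256 at (100, 100)
    DrawImagePart(renderPass, texture, 0, 0, 128, 128, 100, 100, 256, 256);

});
```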
It's not all just drawing images. How about if you don't even need a texture and just want to draw a filled rectangle? Check out this compact little example:
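Something along these lines, reusing the setup from the earlier sketches, where the FillRect import path and its parameter order (position, size, colour) are my assumptions:

```js
import { FillRect } from 'phaser4/renderer/webgl1/draw/FillRect';

const world = new World();

On(world, WorldPostRenderEvent, (renderPass) => {

    //  A 256x160 red rectangle at 100 x 100 - no texture, no Game Object
    FillRect(renderPass, 100, 100, 256, 160, 0xff0000);

});
```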
You can run this example here.
Let's bump it up a notch and draw lots of rectangles:
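For example, a sketch that scatters randomly placed, randomly coloured rectangles every frame:

```js
On(world, WorldPostRenderEvent, (renderPass) => {

    for (let i = 0; i < 256; i++)
    {
        const x = Math.random() * 800;
        const y = Math.random() * 600;
        const color = Math.random() * 0xffffff;

        //  However many rects we add, this is still batched internally
        FillRect(renderPass, x, y, 32, 32, color);
    }

});
```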
You can run this example here (and it looks a lot better moving, too!)
Not the most thrilling of examples, I admit, but I'm sure there are a number of you now appreciating the flexibility of this approach. As with textures, the above is of course batched, too.
Let's mix the two things we've got here together, using FillRect to draw a nice gradient sky effect and DrawFrame to draw some tiles. I'm going to show the full code of this demo below, so you can appreciate just how tiny it is, even with the loading of the map data:
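Roughly, it looks like this. The asset names, the map data format, and the plain fetch call used for the map are illustrative stand-ins; the FillRect sky plus DrawFrame tiles structure is the point:

```js
import { On } from 'phaser4/events';
import { ImageFile } from 'phaser4/loader/files/ImageFile';
import { GetTexture } from 'phaser4/textures';
import { World, WorldPostRenderEvent } from 'phaser4/world';
import { FillRect } from 'phaser4/renderer/webgl1/draw/FillRect';
import { DrawFrame } from 'phaser4/renderer/webgl1/draw/DrawFrame';

//  Game creation omitted - same setup as the first sketch

//  Load the tileset and the map data (a 2D array of frame indexes)
await ImageFile('tiles', 'assets/tiles.png').load();

const mapData = await (await fetch('assets/map.json')).json();

const texture = GetTexture('tiles');

const world = new World();

On(world, WorldPostRenderEvent, (renderPass) => {

    //  Gradient sky: a stack of FillRects, each a slightly brighter shade of blue
    for (let i = 0; i < 20; i++)
    {
        FillRect(renderPass, 0, i * 30, 800, 30, 0x000044 + i * 8);
    }

    //  Tiles: one DrawFrame per map cell
    mapData.forEach((row, y) => {

        row.forEach((frame, x) => {

            DrawFrame(renderPass, texture, frame, x * 32, y * 32);

        });

    });

});
```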
And here it is in action:
It's much better to view the example to see this running :)
How many draw calls?
1, of course :)
Because Phaser 4 is entirely modular, all of these functions, such as FillRect and DrawFrame, already exist internally. It's just a slightly different way of calling the API, so you can do whatever you like with the Render Pass.
I know a number of devs who genuinely struggle with the object-orientated approach enforced by lots of libs, Phaser included. Not because they can't grasp the concepts, but because it all too often feels like you're being made to bend to the will of the API, rather than being able to tell it exactly what it is you want to do: "Draw this, right there, and do it quickly".
By exposing the renderer functions Phaser 4 already contains, you are fully able to use the API in this way, never needing to create a Game Object class if you don't wish to. The full render flow is under your control, to use as you see fit. It's pretty damn tiny, too. Most of the examples above don't even exceed 20 KB (min/gz).
As the Render Pass functionality expands, so too will the Direct Mode functions. At the moment you can draw images, frames, rectangles, triangles, lines, and quads with any vertices, which, honestly, is enough for the vast majority of games. Eventually, everything the renderer does, you will be able to do too, including shaders, masks, and more.
New Debugging Tools
For a long time, I've wanted to have tools that would allow me to monitor what is going on in a Phaser Scene, as well as debug it more quickly. As I was recoding the World render flow, it made a lot of sense to create these tools to assist me. So now I present to you, the Phaser 4 Debug Panel:
The easiest way to see what it can do is to experience it.
Start by opening this example.
Move the windows around so you can see what you're doing and then have a play. If you click anywhere on the left of the demo it will spawn a random Sprite. If you click on the right of the demo it'll spawn a random Sprite that is set to rotate every frame.
Spawn a few sprites, like in the screenshot above, and you'll notice the graphs begin to reflect this.
The FPS and MS graphs should be self-explanatory: they show the current estimated frame rate and how many milliseconds it took to process and render the last frame. The Performance gauge in the bottom middle reflects how well the MS rate is staying within a sane budget. You'll see that the more rotating sprites you add, the harder it has to work to process them all.
The Transform graphs show how many transforms had to be recalculated in that frame. You'll notice that the rotating sprites cause this to increase, but static ones do not.
If you click on the pause icon in the debug panel you can then get a much better view of the graphs:
Hovering over the graph allows you to see the value at exactly that game frame. You can also drag and select a range of the graph to zoom into it:
This allows you to analyze the data points more clearly. Double-click the graph to zoom back out again.
Clicking the 'Game Objects' link at the top of the debug panel will switch the view to this:
Here you can see a list of all Game Objects that were rendered in this frame. Clicking one of them opens the inspection panels where you can view and adjust all of their transform and display properties. For example, here I've selected the Lemming sprite and you can see the ability to adjust its rotation, tint, texture, and more.
Here is a video showing me using this tool in more depth.
Kill your Display Bugs with DDT!
As well as this handy panel I have also created what I've internally named DDT: The Display Debug Tools. Rather than a visual tool, this one is installed into Dev Tools and allows you complete access to every display-related feature that Phaser 4 contains. This is extremely handy for quick debugging and testing.
To see it in action view this example and then press F12 to open your Dev Tools. You'll see that DDT is installed:
There are 2 help commands, DDHelp() and DDCommands(), which list the commands available to you. You can see this in the screenshot above.
Let's try it out. First, issue the following commands:
Now you'll see two Sprites appear, just as you've created them:
You can manipulate them directly, just as if you were doing it via code, by accessing their properties:
Every single display list command is available to you. You can create Sprites, add them to other Sprites or Containers as children, manipulate them, move them between parents, shuffle them, destroy them - basically anything Phaser can do, you can do from the command line, without having to worry about building a test example and rebuilding and running it each time.
One really useful command is List(), which displays the display tree as a full parent-child hierarchy you can expand and collapse node by node:
As you can see, this is really handy!
Both of these tools are, of course, fully optional. If you don't want them, you don't have to use them. Neither of them will take up any space in your bundle if you don't import them, which is exactly the way it should be.
I hope you've got a better idea of some of the things I've been doing with Phaser 4 recently. In the next Dev Log I'll cover the new Entity Component System, so keep your eyes peeled :)