It is fairly common knowledge that when you watch a movie, or any other video, you are not actually looking at moving imagery. Instead, you are looking at static images played fast enough that your brain is tricked into perceiving movement, filling in the gaps on its own.
Games aren't very different. Whereas a movie has all of its static images prepared in advance, stored in digital or analog format, a game has to render these static images on the fly to make them interactive. In a small fraction of a second, a device's graphics card produces a static image of the current game state and displays it on the screen. A common frequency for this is sixty times per second, often denoted 60 FPS, where FPS stands for frames per second. "Frame" is just the word programmers use for a static image that makes up the moving display. This number, 60, is called the framerate.
The rate at which these images are updated really matters. Movies are generally shown at 24 FPS, which is clearly enough to trick us. Yet when you play a game at 24 FPS (or 30 FPS, which is quite a common framerate), you will find that the game doesn't feel very smooth. Firstly, there is a genuine visual difference - I recommend looking at some 30 vs 60 FPS comparison videos online to see it for yourself. A second problem is the delay between seeing something on the screen, responding to it with your keyboard or controller, and seeing the result back on the screen. For fast-paced games, that difference is very noticeable, if not consciously then unconsciously.
So far so good, and most gamers probably know this much. What may be less obvious is that updating the game also happens in small time steps. Many times a second, the game looks at its current state, does some math and logic, and calculates the new state. The frequency at which this is done is often tied to the framerate, so a game will often update, render, update, render, update, ...
I've mentioned 30 and 60 FPS several times so far, and with good reason: they are the framerates most commonly chosen for games. Let's say our game updates and renders at 60 FPS. This makes things really convenient, because each time we do an update, we know that the same amount of time has passed as between previous updates. It is also exactly the same amount of time that passes between updates on a completely different computer (perhaps a much more powerful one). This simplifies programming a lot. We can just say: every frame, move the player five pixels to the left. This will result in the player moving left at a constant speed.
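A fixed-step loop like the one described above can be sketched in a few lines of Python. The names (`update`, `render`, `player_x`) are illustrative, not taken from any real engine:

```python
import time

FPS = 60
FRAME_TIME = 1 / FPS  # each frame is assumed to take exactly 1/60th of a second

player_x = 0

def update():
    global player_x
    # Because every frame represents the same time slice, a constant
    # per-frame step gives a constant on-screen speed: 5 px * 60 FPS = 300 px/s.
    player_x -= 5

def render():
    pass  # draw the current game state (omitted in this sketch)

def game_loop(frames):
    for _ in range(frames):
        update()
        render()
        time.sleep(FRAME_TIME)  # naive pacing; a real loop accounts for work time

game_loop(60)  # one simulated second: the player has moved 300 px to the left
```

Note that this loop never measures the clock during `update`: it simply trusts that every frame represents 1/60th of a second, which is exactly the assumption that breaks down later in the article.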
This all sounds perfect, but the reality is often far from it. Space Invaders, one of the oldest games, doesn't run at a fixed FPS. Instead, it just moves the aliens and draws them as fast as it can. The more aliens you kill, the fewer aliens the computer has to worry about, and thus the faster the remaining aliens move! This, by the way, is how the difficulty curve for games was "invented".
Space Invaders shows off a real problem, though: rendering can take a long time. You might already see it coming: on some computers, rendering may take longer than 1/60th of a second. Suddenly, your game doesn't update and render sixty times per second, but only forty. As a consequence, your game runs a third slower than it is supposed to!
This is where variable framerates come in. Instead of assuming that every frame takes exactly the same time on every machine, we can simply measure the time since the last update. Instead of moving five pixels every frame, we make sure we move 300 pixels (= 5 pixels multiplied by 60 FPS) every second, by taking the elapsed time since the last frame in seconds and multiplying this speed by that time. That sounds easy, but you have to do this everywhere. If you forget it in even a single place, your entire logic is out of whack. Sometimes you don't even find out until much later, when running the game on a completely different machine, and at that point it's often hard to track down where you missed that extra factor in your multiplication.
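The delta-time technique can be sketched as follows; the 300 px/s speed comes from the example above, while the names and the simulated frame times are illustrative:

```python
import time

PLAYER_SPEED = 300.0  # pixels per second (5 px/frame * 60 FPS)

player_x = 0.0
last_time = time.perf_counter()

def update():
    global player_x, last_time
    now = time.perf_counter()
    dt = now - last_time  # seconds elapsed since the previous update
    last_time = now
    # Speed is expressed per second, so we scale by the measured elapsed time.
    # Whether this frame took 1/60th or 1/40th of a second, the on-screen
    # speed stays 300 px/s.
    player_x -= PLAYER_SPEED * dt

for _ in range(10):
    update()
    time.sleep(0.01)  # stand-in for rendering work of varying duration
```

The subtle trap the article describes is hiding in that one multiplication by `dt`: every piece of movement, animation, and timer logic in the whole game needs the same factor, and a single forgotten one only shows up as wrong speeds on machines with a different framerate.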
Variable framerates are hard under the best of circumstances, but moving a game from a fixed framerate to a variable one, with all the technical debt you may have accumulated, is even harder. Switching from one fixed framerate to another is almost equally painful. So this... this is where the RuneScape tick system finally comes in.
RuneScape is a bit unlike other games. The game renders many times a second, and even updates many times a second locally. However, the actual game logic is only updated once every 0.6 seconds. The game can't let you update everything locally, because if your computer decided whether you get a rare drop table drop or not, you could easily spoof that and tell the server you got that perfect loot every single time. So all the important updates are done by the server, and the server simply can't handle updating the game 60 times per second. Even if it could, the sheer network traffic required to send you all that information would be significant. And even if that were not a problem either, network lag would still make updating that fast a bit of a waste, since you would hardly notice the difference.
This tick system, where the server moves the entire game (and that means everything in the entire world of Gielinor!) forward by exactly 0.6 seconds every time, has been fundamental to RuneScape's architecture from the very first day. If you're walking, the game will move you exactly one tile per tick. The logic says you can attack with a whip once every four ticks, the same speed at which you can fish. This assumption of the 0.6-second tick is baked into every piece of logic ever written for RuneScape. To change the tick length to 0.3 seconds (exactly half), everything suddenly needs to take twice as many ticks. Not only that, but suddenly you can be in the middle of moving from one tile to the next, which is not something the server ever had to handle before (the server "teleports" you to the next tile every tick, as it were).
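A server tick loop of this kind could be sketched roughly as below. The 0.6-second tick, the one-tile-per-tick movement, and the four-tick whip cooldown come from the article; everything else (names, structure) is a hypothetical illustration, not Jagex's actual code:

```python
TICK_SECONDS = 0.6  # RuneScape's fixed tick length

class Player:
    def __init__(self):
        self.tile = 0
        self.last_attack_tick = -999  # far in the past, so the first attack is allowed

    def can_attack(self, current_tick, cooldown_ticks=4):
        # A whip can be swung once every four ticks (once every 2.4 seconds).
        return current_tick - self.last_attack_tick >= cooldown_ticks

def run_ticks(player, n_ticks):
    for tick in range(n_ticks):
        # Movement happens in whole tiles only: the server "teleports" the
        # player one tile forward each tick; there is no in-between position.
        player.tile += 1
        if player.can_attack(tick):
            player.last_attack_tick = tick

p = Player()
run_ticks(p, 8)  # 8 ticks: 8 tiles walked, attacks land on ticks 0 and 4
```

Notice that nothing here stores a duration in seconds: everything counts in ticks. That is precisely why halving the tick length would force every cooldown, walk, and timer in the game to be rewritten in terms of twice as many ticks.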
Removing the tick system is roughly equivalent to moving to a variable tickrate on the server: we no longer multiply all the durations, but have to somehow change the logic to be completely time-based. It is incredibly easy to miss one of the systems, and then suddenly your entire game goes haywire.
Changing anything about the tick system means touching virtually every single system that exists in the game. Apart from being a colossal task, any mistake made in the process is really hard to find, but can have big consequences for the final result. It is for this reason that it is unlikely we will see any big changes even in the next few years. Until then, all Jagex can do is get better at faking it: make the client predict parts of what happens on the next tick, and make the game more responsive and smooth as a consequence. But for those holding out for an end to tick-system-related frustrations: this is one of the most complex problems you can have in game development, and it is not one that is solved overnight, so you'll just have to deal with it for a while longer.