
Which is the best server that has been advertised here and accepted for the topic?

59 members have voted


    • Lineage][ Staging
      7
    • L2Mafia
      26
    • L2Headoff
      2
    • L2Justice
      9
    • L2Crysis
      15


Recommended Posts

Posted

As I see on the L2Headoff site, it says there are 2 online on x7 and 3 online on x300...

 

Why is this server in the poll with all the epic servers (well, some of them are epic)? :P

Posted

As I see on the L2Headoff site, it says there are 2 online on x7 and 3 online on x300...

 

Why is this server in the poll with all the epic servers (well, some of them are epic)? :P

 

Because the administrator submitted an application.

I had no reason to disapprove it.

Posted

Sorry for double posting (1 day has passed and this is stickied, which means that I didn't bump it).

 

First of all, I would like to wish you a happy new month.

It's December the 1st today. This means that all the votes have gone back to ZERO (0) and you can start voting again!

 

I wish you the best.

 

Best Regards,

Coyote™

Posted

Which of the Hellbound servers has no stupid customs?

 

Please do not post off-topic stuff in my topic.

If you want a Hellbound server without customs, request one at the proper section.

Thanks.

Guest
This topic is now closed to further replies.



  • Posts

    • And Discord: https://discord.gg/3aYqWNqb
    • You can find some H5 skins shared in old L2 modding Discords, but most of the higher-quality ones are either paid or come bundled with full client edits. I usually mix in commissioned work and whatever I can patch myself.
    • There is no need for gRPC in this case. Even though it was originally gRPC-based, we don't need it to be bi-directional, so we switched to simple HTTP requests for the web calls and SSEs for the data streamed from the server. There are distributed locks in place to prevent race conditions between actions that can happen across multiple web instances and the server.

      Local models can also be slow depending on the model, and most external models can actually be faster than local ones if you use Flash 2.5 or something along those lines. I am running 512 GB of unified memory on my Mac Studio M3 Ultra, so a small local model is pretty fast, but I tested it with Gemini too and it works equally fast, and in some cases faster. The way it works is that I'm using pgvector (one of the benefits of moving to Postgres) to search the data and see what the player can see, etc., and the next few actions are batched for 2-4 seconds until the next LLM request fires. The batching also includes branching logic, so if, for example, they fall under some HP they will switch to kiting instead of attacking, or maybe they heal.

      Everything is authed and permission-based. The server and the backend of the frontend communicate securely, either with a symmetric key (not recommended for production) or a certificate (the recommended way), so there is no worry. It's all tied to the account's access level, so nobody can perform an action they normally wouldn't be allowed to. Even the MCP is token-based, and there are prompt-injection protections in place. The MCP is audited, and every mutation needs confirmation. The admin area is only accessible to the admin account anyway, so normal users can't access it.
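The batching-with-branching behaviour described in the post above can be sketched in a few lines of Python. Everything here (the names, the 30% HP threshold, the heal amount) is a hypothetical illustration of the idea, not the author's actual code:

```python
# Sketch: plan a short batch of bot actions, re-checking state before
# each one so low HP branches into healing or kiting instead of attacking.
from dataclasses import dataclass


@dataclass
class BotState:
    hp: float        # current hit points
    max_hp: float    # maximum hit points
    has_heal: bool   # whether a heal skill is available


def next_actions(state: BotState, batch_size: int = 4) -> list[str]:
    """Plan the next few actions for one 2-4 second batch window."""
    actions: list[str] = []
    hp = state.hp
    for _ in range(batch_size):
        ratio = hp / state.max_hp
        if ratio < 0.3 and state.has_heal:
            actions.append("heal")
            # Assumed heal amount; a real bot would also track cooldowns.
            hp = min(state.max_hp, hp + 0.4 * state.max_hp)
        elif ratio < 0.3:
            actions.append("kite")   # retreat while low on HP
        else:
            actions.append("attack")
    return actions


# Healthy bot keeps attacking; a wounded one heals first, or kites
# if no heal is available.
print(next_actions(BotState(hp=100, max_hp=100, has_heal=True)))
print(next_actions(BotState(hp=20, max_hp=100, has_heal=True)))
print(next_actions(BotState(hp=20, max_hp=100, has_heal=False)))
```

Planning a whole batch up front (while still predicting state changes inside it) is what lets the bot act smoothly between LLM requests instead of blocking on each call.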

