Why does a server use more RAM over time?

A server's increasing RAM consumption is not necessarily due to memory leaks.
A more likely reason is that the server's software is caching more and more of its data in RAM over time.
The software does this with good intentions: it notices that RAM is sitting unused and uses it to cache the data that its users access most often.
This behavior is especially typical of software that handles large amounts of data, such as DBMSs and the search engines built on top of them (e.g. OpenSearch and Elasticsearch).
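The "cache the hottest data, evict the coldest" behavior described above is commonly implemented as a least-recently-used (LRU) cache. The following is a minimal, illustrative Python sketch of that idea; the class and its names are hypothetical and not taken from any particular server product.

```python
from collections import OrderedDict


class LRUCache:
    """Minimal least-recently-used cache: keeps the most
    recently accessed entries and evicts the coldest one
    once the capacity limit is reached."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the coldest entry


cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now the most recently used entry
cache.put("c", 3)     # capacity exceeded: "b", the coldest, is evicted
print(cache.get("b"))  # → None
print(cache.get("a"))  # → 1
```

A real server's cache is bounded by available RAM rather than an entry count, which is why its memory footprint grows until it hits that limit and then plateaus instead of growing forever (as a true leak would).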