Hi everyone,
I am looking for a PostgreSQL DBA for a fully remote position. It is full time and runs on a B2B contract.
Salary is up to €120k + bonus + 30 days paid leave.
Support for Short-Medium Term Future Plans:
• Migration of PostgreSQL version 14 to version 16.
• Migration of on-premise PostgreSQL to AWS.
• Partitioning, purging, and/or migration of large historical tables (billions of rows).
I apologise if this is against channel rules.
Message @garyos96 for more details.
Hello, I get an out of memory error when creating the PostGIS extension, but I think there is plenty of memory: PG 10, PostGIS 2.4.4, 8 GB total RAM, 2 GB shared_buffers, swappiness 0, vm.overcommit_ratio 75 and vm.overcommit_memory 2. What would you advise I look at?
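A hedged guess based only on the settings above: with vm.overcommit_memory=2 the kernel refuses allocations once committed memory reaches CommitLimit (swap + overcommit_ratio% of RAM), so CREATE EXTENSION postgis can fail even while plenty of RAM looks free. A quick check might look like this (commands are a sketch, adjust to your box):

grep -E 'CommitLimit|Committed_AS' /proc/meminfo    # how close is Committed_AS to CommitLimit?
sysctl vm.overcommit_ratio vm.overcommit_memory     # currently 75 and 2 per the question
# if the commit limit is the problem, adding swap or raising vm.overcommit_ratio
# (e.g. sysctl -w vm.overcommit_ratio=90) raises CommitLimit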
Hello and welcome to this chat. Here we discuss PostgreSQL and provide best-effort voluntary support on PostgreSQL issues.
If you have a question, ask kindly and include log files, error descriptions and, ideally, what you're trying to do.
- 🚫 photos and screenshots
- 🚫 spam and offtopic
- ✅ questions with log files and clear explanation of what's gone wrong
Hello guys,
please, how can I export my EXPLAIN output to plain text or JSON format? I want to save it and import it into the explain.dalibo query tool.
I use \o with the query and then export the output, but I can't use it with Dalibo: either I'm unable to copy it, or I only get the bare plan without the additional information the tool shows.
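A sketch of what usually works for explain.dalibo.com (the table and query here are made up): run EXPLAIN with the JSON format and redirect the output to a file from psql, then paste the file contents into the tool together with the query text:

\o plan.json
EXPLAIN (ANALYZE, BUFFERS, FORMAT JSON) SELECT * FROM my_table WHERE id = 42;
\o

FORMAT TEXT works too, but the JSON plan keeps the per-node details (buffers, timing) that the Dalibo visualiser can show.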
Can anyone help me with how to publish a database on Ubuntu and access it from other devices?
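Assuming a stock Ubuntu install and that "publish" means accepting connections from other machines on the network, a minimal sketch (the subnet, auth method and paths are placeholders, adjust for your network and PostgreSQL version):

-- run as a superuser, then restart the server (e.g. sudo systemctl restart postgresql)
ALTER SYSTEM SET listen_addresses = '*';
-- pg_hba.conf (e.g. /etc/postgresql/<version>/main/pg_hba.conf) also needs a line such as:
--   host  all  all  192.168.1.0/24  md5
-- and the firewall must allow the port (e.g. sudo ufw allow 5432/tcp)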
Hello guys,
we have set up the environment needed for EDB alongside our PostgreSQL setup.
Now I want a consistent, reliable way to migrate my community PostgreSQL database to the EDB environment. Does this require a specific plan or compatibility checks? I need the most efficient plan for this migration from PostgreSQL to the EDB PostgreSQL environment. Thanks in advance.
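I can't speak to EDB-specific tooling, but since EDB's distributions accept standard PostgreSQL dumps, a plain logical dump and restore is a reasonable baseline to compare against (hostnames, users and the database name below are placeholders):

pg_dumpall -g -h old-host -U postgres > globals.sql               # roles and tablespaces only
pg_dump -Fc -h old-host -U postgres -f mydb.dump mydb             # custom-format dump per database
psql -h edb-host -U enterprisedb -d postgres -f globals.sql       # restore globals first
pg_restore -h edb-host -U enterprisedb -d postgres --create mydb.dump

EDB also ships its own migration tooling and documentation, so it's worth checking their docs before settling on a plan.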
I want to get the load pattern on the database for a particular time period, e.g. how many INSERT, UPDATE and SELECT queries were executed against each table during a specific time window.
Yes, I have created a table with columns like inserted, updated, deleted and fetched. Then I wrote a script that inserts data into this table from pg_stat_user_tables and scheduled it as a cron job every hour.
Then I run a query to calculate the difference over 1 hour. It gives me the desired results, but it is a very lengthy process because I have to set this up for each table individually.
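For what it's worth, a single snapshot table keyed by relation name can cover every table in one go, so nothing has to be set up per table. A rough sketch (table and column names are made up):

CREATE TABLE IF NOT EXISTS stat_snapshots (
    snapped_at     timestamptz NOT NULL DEFAULT now(),
    relname        name        NOT NULL,
    n_tup_ins      bigint,
    n_tup_upd      bigint,
    n_tup_del      bigint,
    seq_tup_read   bigint,
    idx_tup_fetch  bigint
);

-- run hourly from cron, e.g. via psql -c "..."
INSERT INTO stat_snapshots (relname, n_tup_ins, n_tup_upd, n_tup_del, seq_tup_read, idx_tup_fetch)
SELECT relname, n_tup_ins, n_tup_upd, n_tup_del, seq_tup_read, idx_tup_fetch
FROM pg_stat_user_tables;

-- activity per table between consecutive snapshots (the counters are cumulative, so diff them)
SELECT relname,
       n_tup_ins - lag(n_tup_ins) OVER w AS inserts,
       n_tup_upd - lag(n_tup_upd) OVER w AS updates,
       n_tup_del - lag(n_tup_del) OVER w AS deletes,
       snapped_at
FROM stat_snapshots
WINDOW w AS (PARTITION BY relname ORDER BY snapped_at)
ORDER BY relname, snapped_at;

Note that pg_stat_user_tables has no per-table SELECT statement counter, only tuple-read counters (seq_tup_read / idx_tup_fetch), so counting SELECT statements per table would need statement logging instead.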
Hi all, is there any provision in PostgreSQL to know the number of operations on all tables in a database for a particular time period? I mean, can we check how many INSERT/UPDATE/DELETE/SELECT queries were executed against each table in a database during a specific time window?
Can anyone confirm if someone has recently completed the PostgreSQL Associate Certification?
Definitive complete course:
https://www.postgresql.org/files/documentation/pdf/17/postgresql-17-A4.pdf
Actually, I have an RDP server on which the database is stored locally, and I want to access it from my Windows machine.
I reckon there's a bunch of documents, webinars etc. to be found at EDB for *that* path 😏
You could store & reset the table statistics on a regular basis.
Or use pgBadger with full statement logging.
Or write a patch 😏
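If you go the pgBadger route, "full statement logging" roughly means settings like the ones below (the log_line_prefix follows pgBadger's usual recommendation, but double-check the pgBadger docs for your version), applied with a reload:

ALTER SYSTEM SET log_min_duration_statement = 0;   -- log every statement with its duration
ALTER SYSTEM SET log_line_prefix = '%t [%p]: user=%u,db=%d,app=%a,client=%h ';
ALTER SYSTEM SET log_checkpoints = on;
ALTER SYSTEM SET log_connections = on;
ALTER SYSTEM SET log_disconnections = on;
ALTER SYSTEM SET log_lock_waits = on;
ALTER SYSTEM SET log_temp_files = 0;
SELECT pg_reload_conf();

Be aware that logging every statement can generate a lot of log I/O on a busy server; a threshold above 0 is a common compromise.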
Is anyone here open to hosting the next #pgsqlphriday blogging event?
Details: https://www.pgsqlphriday.com/
Or do you have topic ideas?