I'm using PHP and the exec() function to run two MySQL script files one after another.
The first file contains approximately 2,000 UPDATE statements (see below) and is 114 KB in size.
UPDATE table1 SET is_active = ...
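A common cause of slowness here is that each UPDATE commits (and flushes to disk) individually. A minimal sketch of batching the statements in one transaction, using Python's sqlite3 as a stand-in for the MySQL target (the `id` column is an assumption; `table1` and `is_active` come from the script above):

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL target (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY, is_active INTEGER)")
conn.executemany("INSERT INTO table1 (id, is_active) VALUES (?, 0)",
                 [(i,) for i in range(1, 2001)])

# Wrapping the ~2k UPDATEs in one explicit transaction means one commit
# (and one disk flush) for the whole batch instead of one per statement.
with conn:  # emits BEGIN ... COMMIT around the loop
    for row_id in range(1, 2001):
        conn.execute("UPDATE table1 SET is_active = 1 WHERE id = ?", (row_id,))

active = conn.execute("SELECT COUNT(*) FROM table1 WHERE is_active = 1").fetchone()[0]
print(active)  # 2000
```

In the script-file setting, the same effect comes from adding `START TRANSACTION;` at the top of the file and `COMMIT;` at the end.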

As far as I understand, a fixed page size is the size of the page used to read and write data, but I can't work out what "disk-based" means.
I came across these terms in this paper ...
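"Disk-based" means the data lives on disk and the engine moves it to and from memory one whole page at a time, never single rows. A toy calculation with hypothetical numbers (8 KB is PostgreSQL's default page size; the row size is assumed):

```python
import math

PAGE_SIZE = 8192      # bytes per page (e.g. PostgreSQL's default)
ROW_SIZE = 100        # assumed average row size, including per-row overhead

# A disk-based engine transfers whole pages, so these two numbers
# determine how many disk reads a full scan costs.
rows_per_page = PAGE_SIZE // ROW_SIZE            # rows that fit on one page
pages_for_1m_rows = math.ceil(1_000_000 / rows_per_page)

print(rows_per_page, pages_for_1m_rows)  # 81 12346
```

So touching even one row still costs a full 8 KB page read, which is why page layout matters so much for disk-based systems.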

Under high load, the server application stops responding to clients because BEGIN/COMMIT statements execute very slowly (some take 15 seconds). The queries themselves are simple: insert, update two columns ...

I'm running some queries against PostGIS, and some of them take a really long time (> 60 seconds). My queries all look like the one below, except that they hit different tables (such as osm_placesbelow).
This query takes about ...

I want to understand why migrating data from a table whose fields are all VARCHAR(50) to a table with optimized, smaller types caused the new table (containing 61,065,164 rows) to be 4.46 GB, which is larger ...

I've been experiencing a massive slowdown on my server after launching a new website.
Using MONyog I can see that after an Apache/MySQL restart, the queries per second start out nice and low for around 10 ...

Disclaimer: I am relatively new to PostgreSQL.
I'm wondering how to optimize a query that does two INNER JOINs. My scenario is fairly simple:
Select Posts with a photo (Posts.photo IS NOT NULL) and a ...

I have a table with 50 columns, and a friend of mine has told me to split it into four different tables, with around 12 columns in each. He proposed a design with a Master table that refers to the PK of ...

I have a MySQL database table, table1, with a dozen columns in it. One column is a mediumtext column with lots of data in it, and all the other columns are small (ints, timestamps, etc.). The presence ...

I'm working on a regular price comparison site with two exceptions. I want to show price history and also allow users to subscribe to a product with a specific size and color. As a result, I've ended ...

I have a scenario with two databases: one resides on a remote server and the other is local. I have to fetch records from the remote DB whenever new records are inserted into its tables and ...

I need to parse the contents of an API route and insert them into a MySQL database. The aim of parsing the contents is to replicate the data from the API into my database table.
I need to check the API response every ...
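Since repeated polls will see mostly the same records, replication usually means an upsert rather than a plain insert. A minimal sketch, assuming a hypothetical `items(id, name)` schema for the parsed payload (sqlite3 stands in for MySQL, which spells the same idea `INSERT ... ON DUPLICATE KEY UPDATE`):

```python
import sqlite3

def sync_records(conn, records):
    """Upsert one batch of parsed API records.

    The items table and its id/name columns are assumptions, since the
    real API schema isn't shown; records is the already-parsed payload.
    """
    with conn:  # one transaction per poll
        conn.executemany(
            "INSERT INTO items (id, name) VALUES (:id, :name) "
            "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
            records,
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# Two simulated polls: the second updates id 2 and adds id 3, without duplicates.
sync_records(conn, [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}])
sync_records(conn, [{"id": 2, "name": "b2"}, {"id": 3, "name": "c"}])

print(conn.execute("SELECT id, name FROM items ORDER BY id").fetchall())
# [(1, 'a'), (2, 'b2'), (3, 'c')]
```

The periodic polling itself would wrap `sync_records` in a scheduler (cron, or a sleep loop), which is omitted here.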

Setting
In a data warehouse, I am joining a fact table to 20 dimensions. The fact table has 32 million rows and 30 columns. It is a temporary staging table, so I don't have to deal with other users ...

This query takes around 2 seconds when the ORDER BY is included; without it, the query takes less than a tenth of a second.
The field doing the sorting, qc_products.addedDate, is indexed and ...

I have a MySQL table pageviews in production with 4M rows that records users' page views on posts. I need to know which posts a specific user has read, but this request takes up to 15 seconds to ...
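The usual fix for this access pattern is a composite index leading with the filter column, so the per-user lookup becomes an index range scan instead of a 4M-row table scan. A sketch with assumed column names (sqlite3 for illustration; the same index works in MySQL):

```python
import sqlite3

# Hypothetical pageviews schema: 50 users x 100 posts each.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pageviews (user_id INTEGER, post_id INTEGER)")
conn.executemany("INSERT INTO pageviews VALUES (?, ?)",
                 [(u, p) for u in range(1, 51) for p in range(1, 101)])

# Composite index on (user_id, post_id): it both narrows the scan to one
# user and covers the selected column, so no table lookups are needed.
conn.execute("CREATE INDEX pv_user_post ON pageviews (user_id, post_id)")

posts = [r[0] for r in conn.execute(
    "SELECT DISTINCT post_id FROM pageviews WHERE user_id = ?", (7,))]
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT DISTINCT post_id FROM pageviews WHERE user_id = ?",
    (7,),
).fetchall()
print(len(posts), any("pv_user_post" in row[3] for row in plan))
# 100 True
```

Column order matters: `(user_id, post_id)` serves this query, while `(post_id, user_id)` would not, because the equality filter is on `user_id`.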

I am working on table join order optimization.
During query processing, I want to get the table join order generated by the optimizer and then update it using my own algorithm in PostgreSQL.
How can I update ...

I am working on a database optimization (planning for future project growth) and need some help.
Currently, every table uses a uniqueidentifier as its PK (clustered index), and we have high index fragmentation ...

I'm not very experienced with database query optimization. I've been reading through similar questions here and Postgres tuning articles online, but unfortunately I haven't had any luck. Here's what I ...

I have found that copying to a tmp table takes most of the time in a slow query that I want to optimize. I watched the output of 'profiling' in a terminal. This happens on a pics table with 6000 ...

Short Description
I have a database of special offers that bundle several items together. I want to be able to search for a list of specific items and get a combination of bundles that includes all ...