Thu Jan 2 11:35:15 PST 2014
- Previous message: [Slony1-general] testing slony state in slony 2.0.4
- Next message: [Slony1-general] 6 hours to replicate single table, 12 hours to replicate DB
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Wondering what settings I need to speed this up. A rebuild of a db takes a long time: 6 hours for a single table. No I/O issues, no load, just slon/postgres taking their sweet old time. I would like to use the resources available to speed this up. The table is:

2013-12-21 19:17:58 PST CONFIG remoteWorkerThread_1: Begin COPY of table "impressions"
2013-12-21 19:37:03 PST CONFIG remoteWorkerThread_1: 12657163552 bytes copied for table "impressions"
2013-12-22 01:40:22 PST CONFIG remoteWorkerThread_1: 22944.144 seconds to copy table "impressions"  <-- 6 hours

Postgres 9.2.4
Slony 2.1.3

This is a larger table, but because of bloat etc. we need to do ground-up rebuilds every so often to clean it out (vacuums don't do it).

Slony config, pretty much at default other than sync interval:

# Check for updates at least this often in milliseconds.
# Range: [10-60000], default 2000
sync_interval=1000
#sync_interval_timeout=10000

# apply every single SYNC by itself.
# Range: [0,100], default: 6
#sync_group_maxsize=6

#sync_max_rowsize=8192
#sync_max_largemem=5242880

I either need some advanced settings for when we are doing a rebuild, to speed up the process, or some configuration that can stay in place during normal workloads as well. Under normal workloads things are replicated and kept in sync; it's just the rebuild portion that is slow. I would like to see it actually stressing my boxen :)

Thanks
Tory
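One thing that often dominates the initial COPY is the subscriber's PostgreSQL configuration rather than slon itself. A hedged sketch of settings commonly relaxed for a 9.2-era bulk load (the values are illustrative assumptions, not recommendations from this thread, and should be reverted or reviewed after the rebuild):

```
# postgresql.conf on the subscriber, PostgreSQL 9.2-era parameter names
maintenance_work_mem = 1GB    # speeds up the index rebuilds that follow the COPY
checkpoint_segments = 64      # fewer forced checkpoints during the bulk load
synchronous_commit = off      # acceptable for a rebuild that can simply be redone
wal_buffers = 16MB
```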
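For context, the two log lines above pin down the effective copy rate; a quick back-of-the-envelope check (plain Python, nothing Slony-specific, numbers taken verbatim from the log):

```python
# Effective throughput implied by the slon log lines above.
bytes_copied = 12657163552   # "12657163552 bytes copied for table impressions"
seconds = 22944.144          # "22944.144 seconds to copy table impressions"

mib_per_s = bytes_copied / seconds / (1024 ** 2)
print(f"{mib_per_s:.2f} MiB/s")  # roughly 0.53 MiB/s
```

At about half a megabyte per second, the bottleneck is clearly not raw disk or network bandwidth, which supports the "no I/O issues, no load" observation.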