Tue Aug 28 12:19:24 PDT 2007
Update of /home/cvsd/slony1/slony1-engine/doc/adminguide
In directory main.slony.info:/tmp/cvs-serv21311
Modified Files:
Tag: REL_1_2_STABLE
logshipping.sgml
Log Message:
Update docs for log shipping to reflect Jan's recent changes
Index: logshipping.sgml
===================================================================
RCS file: /home/cvsd/slony1/slony1-engine/doc/adminguide/logshipping.sgml,v
retrieving revision 1.16
retrieving revision 1.16.2.1
diff -C2 -d -r1.16 -r1.16.2.1
*** logshipping.sgml 2 Aug 2006 18:34:59 -0000 1.16
--- logshipping.sgml 28 Aug 2007 19:19:22 -0000 1.16.2.1
***************
*** 4,10 ****
<indexterm><primary>log shipping</primary></indexterm>
! <para> One of the new features for 1.1 is the ability to serialize the
! updates to go out into log files that can be kept in a spool
! directory.</para>
<para> The spool files could then be transferred via whatever means
--- 4,10 ----
<indexterm><primary>log shipping</primary></indexterm>
! <para> One of the new features for 1.1, which only really stabilized
! as of 1.2.11, is the ability to serialize the updates into log files
! that can be kept in a spool directory.</para>
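+ <para> For instance, such archives are generated by the slon process
+ for a subscriber node when it is started with the <option>-a</option>
+ (archive directory) option; the directory and connection values in
+ the following sketch are purely illustrative: </para>
+
+ <programlisting>
+ slon -a /var/spool/slony/archivelogs/node4 mycluster \
+      "dbname=subscriberdb host=subscriberhost user=slony"
+ </programlisting>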
<para> The spool files could then be transferred via whatever means
***************
*** 33,37 ****
<para> This makes log shipping potentially useful even though you
! might not intend to actually create a log-shipped node.</para></listitem>
<listitem><para> This is a really slick scheme for building load for
--- 33,38 ----
<para> This makes log shipping potentially useful even though you
! might not intend to actually create a log-shipped
! node.</para></listitem>
<listitem><para> This is a really slick scheme for building load for
***************
*** 75,78 ****
--- 76,83 ----
<answer><para> Nothing special. So long as the archiving node remains
a subscriber, it will continue to generate logs.</para></answer>
+
+ <answer><warning><para> If the archiving node becomes the
+ <emphasis>origin</emphasis>, on the other hand, it will
+ <emphasis>continue</emphasis> to generate logs.</para></warning></answer>
</qandaentry>
***************
*** 167,175 ****
coming from other nodes (notably the data provider). </para>
! <para> Unfortunately, the upshot of this is that when a node newly
! subscribes to a set, the log that actually contains the data is in a
! separate sequencing from the sequencing of the normal
! <command>SYNC</command> logs. Blindly loading these logs will throw
! things off :-(. </para>
</listitem>
--- 172,177 ----
coming from other nodes (notably the data provider). </para>
! <para> With the revisions to log sequencing that took place in 1.2.11,
! this no longer presents a problem for the user.</para>
</listitem>
***************
*** 300,303 ****
--- 302,347 ----
</itemizedlist>
+
+ <para> As of 1.2.11, there is an <emphasis>even better
+ approach</emphasis> to applying the logs, as the sequencing of their
+ names becomes more predictable.</para>
+
+ <itemizedlist>
+
+ <listitem><para> The log shipped node tracks which log it most
+ recently applied in the table
+ <envar>sl_archive_tracking</envar>. </para>
+
+ <para> Thus, you may predict the ID number of the next file by taking
+ the latest counter from this table and adding 1.</para>
+ </listitem>
+
+ <listitem><para> There is still some variation in the filenames,
+ depending on what the overall set of nodes in the cluster is. All
+ nodes periodically generate <command>SYNC</command> events, even if
+ they are not origin nodes, and the log shipping system does generate
+ logs for such events. </para>
+
+ <para> As a result, when looking for the next file to apply, it is
+ necessary to search in a manner similar to the following (a sketch
+ wrapping this fragment into a small standalone script follows this
+ list):
+
+ <programlisting>
+ ARCHIVEDIR=/var/spool/slony/archivelogs/node4
+ SLONYCLUSTER=mycluster
+ PGDATABASE=logshipdb
+ PGHOST=logshiphost
+ NEXTQUERY="select at_counter+1 from \"_${SLONYCLUSTER}\".sl_archive_tracking;"
+ nextseq=`psql -d ${PGDATABASE} -h ${PGHOST} -A -t -c "${NEXTQUERY}"`
+ filespec=`printf "slony1_log_*_%020d.sql" ${nextseq}`
+ for file in `find $ARCHIVEDIR -name "${filespec}"`; do
+     psql -d ${PGDATABASE} -h ${PGHOST} -f ${file}
+ done
+ </programlisting>
+ </para>
+ </listitem>
+
+ </itemizedlist>
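+
+ <para> The fragment above can, for instance, be wrapped into a small
+ script run periodically (<emphasis>e.g.</emphasis> from
+ <application>cron</application>) on the log shipped node. The
+ following is merely a sketch, reusing the same illustrative paths and
+ connection values as above, that keeps applying archives until no
+ further file has arrived: </para>
+
+ <programlisting>
+ #!/bin/sh
+ # Illustrative wrapper; adjust the variables for the local installation.
+ ARCHIVEDIR=/var/spool/slony/archivelogs/node4
+ SLONYCLUSTER=mycluster
+ PGDATABASE=logshipdb
+ PGHOST=logshiphost
+ while true; do
+     NEXTQUERY="select at_counter+1 from \"_${SLONYCLUSTER}\".sl_archive_tracking;"
+     nextseq=`psql -d ${PGDATABASE} -h ${PGHOST} -A -t -c "${NEXTQUERY}"`
+     filespec=`printf "slony1_log_*_%020d.sql" ${nextseq}`
+     # Assume at most one archive matches a given counter value
+     file=`find ${ARCHIVEDIR} -name "${filespec}" | head -1`
+     if [ -z "${file}" ]; then
+         break    # no new archive has arrived yet
+     fi
+     psql -d ${PGDATABASE} -h ${PGHOST} -v ON_ERROR_STOP=1 \
+          -f ${file} || exit 1    # stop immediately if an archive fails
+ done
+ </programlisting>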
+
</sect2>
</sect1>