[musicians-guide] Added all internal and external links

crantila crantila at fedoraproject.org
Fri Aug 6 02:54:03 UTC 2010


commit 6255968fb5213979894d80400918fcf0560a3c7a
Author: Christopher Antila <crantila at fedoraproject.org>
Date:   Thu Aug 5 22:53:20 2010 -0400

    Added all internal and external links

 en-US/Ardour.xml                                   |   31 ++++++-----
 en-US/Audio_Vocabulary.xml                         |   26 +++++-----
 en-US/Digital_Audio_Workstations.xml               |   38 +++++++-------
 en-US/FluidSynth.xml                               |   20 ++++----
 en-US/Frescobaldi.xml                              |    4 +-
 en-US/LilyPond/LilyPond-counterpoint.xml           |    9 ++-
 en-US/LilyPond/LilyPond-orchestra.xml              |    7 ++-
 en-US/LilyPond/LilyPond-piano.xml                  |    5 +-
 en-US/LilyPond/LilyPond-syntax.xml                 |    4 +-
 en-US/LilyPond/LilyPond.xml                        |   20 +++----
 en-US/Qtractor.xml                                 |   36 ++++---------
 en-US/Real_Time_and_Low_Latency.xml                |    4 +-
 en-US/Revision_History.xml                         |   15 ++++++
 en-US/Rosegarden.xml                               |   13 ++---
 en-US/Solfege.xml                                  |   14 +++---
 en-US/Sound_Cards.xml                              |    7 ++-
 en-US/Sound_Servers.xml                            |   15 +++---
 .../SuperCollider-Basic_Programming.xml            |   54 +++++++++++---------
 en-US/SuperCollider/SuperCollider-Composing.xml    |   14 +++---
 en-US/SuperCollider/SuperCollider-Exporting.xml    |    8 ++--
 en-US/SuperCollider/SuperCollider.xml              |   22 ++++----
 21 files changed, 187 insertions(+), 179 deletions(-)
---
diff --git a/en-US/Ardour.xml b/en-US/Ardour.xml
index 8b386be..255a39c 100644
--- a/en-US/Ardour.xml
+++ b/en-US/Ardour.xml
@@ -17,13 +17,13 @@
 		<section id="sect-Musicians_Guide-Ardour-Knowledge_Requirements">
 			<title>Knowledge Requirements</title>
 			<para>
-				Ardour's user interface is similar to other DAWs.  We recommend that you read !!L!!common interface!!L!! if you have not used a DAW before.
+				Ardour's user interface is similar to other DAWs.  We recommend that you read <xref linkend="sect-Musicians_Guide-DAW_User_Interface" /> if you have not used a DAW before.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Ardour-Software_Requirements">
 			<title>Software Requirements</title>
 			<para>
-				Ardour uses the JACK Audio Connection Kit.  You should install JACK before installing Ardour.  Follow the instructions !!L!! here !!L!! to install JACK.
+				Ardour uses the JACK Audio Connection Kit.  You should install JACK before installing Ardour.  Follow the instructions in <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> to install JACK.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Ardour-Hardware_Requirements">
@@ -56,19 +56,19 @@
 		<section id="sect-Musicians_Guide-Ardour-Recording-Interface">
 			<title>The Interface</title>
 			<para>
-				This section explains some of the graphical interface components that are unique to Ardour.  Components that are consistent through most DAWs are explained in [[User:Crantila/FSC/Recording/DAW_Common_Elements#User_Interface|the "Common Features" section]].
+				This section explains some of the graphical interface components that are unique to Ardour.  Components that are consistent through most DAWs are explained in <xref linkend="sect-Musicians_Guide-DAW_Interface_Vocabulary" />.
 			</para>
 			<para>
 				<!-- [[File:Ardour-interface.xcf]] -->
-				[[File:Ardour-interface-editor_mixer.png|300px|Editor mixer]]
+				[[File:Ardour-interface-editor_mixer.png|300px|Editor mixer]] !!P!!
 				This image shows the editor mixer, located at the left of Ardour's main window.  The editor mixer shows only one mixer strip at a time.  It shows the fader and its controls in the middle of the mixer strip, the panner and its controls at the bottom of the mixer strip, and the "Comments" and outgoing connections buttons.
 			</para>
 			<para>
-				[[File:Ardour-interface-session_sidebar.png|300px|Session sidebar]]
+				[[File:Ardour-interface-session_sidebar.png|300px|Session sidebar]] !!P!!
 				This image shows the session sidebar, located at the right of Ardour's main window.  In this image, the "Regions" tab is selected, so the sidebar shows a list of regions currently in the session.  You can see blue ones which were directly imported, white ones which were created from blue regions, and the arrows to the left of some blue regions, indicating that there are white-coloured sub-regions associated with those blue regions.
 			</para>
 			<para>
-				[[File:Ardour-interface-toolbar.png|300px|Toolbar]]
+				[[File:Ardour-interface-toolbar.png|300px|Toolbar]] !!P!!
 				This image shows the main toolbar, located underneath the transport controls, and above the timeline and its rulers.  In the middle of the toolbar are three unlabeled but highly useful multiple-choice menus: the "snap mode" menu (currently set to "No Grid"); the "grid mode" menu (currently set to "Bars"); and the "edit point" menu (currently set to "Mouse").  To the left of these menus are the tool-selection buttons, the most important of which are the two left-most buttons: Select/Edit Object, and Select/Edit Range.
 			</para>
 		</section>
@@ -167,7 +167,7 @@
 			</para>
 			<para>
 				It's easy to imagine how Ardour acts when it records silence.  When Ardour thinks that a portion of audio is too loud, it outlines the wave-form representation in red, as shown in this image:
-				[[Ardour-red_peaks.png|This audio is too loud.]]
+				[[Ardour-red_peaks.png|This audio is too loud.]] !!P!!
 			</para>
 			<para>
 				There are three simple strategies that can be used to change the input level of an audio signal:
@@ -373,7 +373,7 @@
 	<section id="sect-Musicians_Guide-Ardour-Tutorial_Files">
 		<title>Files for the Tutorial</title>
 		<para>
-			!!L!! Links to tutorial files !!L!!  These tutorial files represent the material required to create a finished version of a song called "Here Is How," written by Esther Wheaton.  The song was released as part of her first album, "Not Legendary," and she has released the source files for this song under !!I!! this licence (probably CC-BY-SA) !!I!!  For more information on the artist, please refer to her [http://www.myspace.com/estherwheaton MySpace page].
+			These tutorial files represent the material required to create a finished version of a song called "Here Is How," written by Esther Wheaton.  The song was released as part of her first album, "Not Legendary," and she has released the source files for this song under !!I!! this licence (probably CC-BY-SA) !!I!!  For more information on the artist, please refer to <citetitle>Esther Wheaton's MySpace Page</citetitle>, available at <ulink url="http://www.myspace.com/estherwheaton" />.
 		</para>
 		<para>
 			The material presented for your use is a folder containing an Ardour file and the associated audio files required to start the tutorial.  The tutorial itself comprises the following sections about editing, mixing, and mastering (or exporting).  The program used to record the audio files split the left and right channels into separate files, so they are imported into Ardour as separate regions.  Therefore, the setup is more complex than it would be if the song were originally recorded in Ardour, but this gives the opportunity to learn in greater detail about busses, creating and using the stereo image, and volume level adjustments.
@@ -381,6 +381,9 @@
 		<para>
 			The unique setup also means that none of the audio regions are in the right place on the timeline, and most of them require extensive editing.  This would be bad if the objective of the tutorial were to create a finished version of the song as quickly as possible; but the objective is to learn how to use Ardour, and this is almost guaranteed.<!-- !! what about the singers !! (What was this supposed to mean?) -->
 		</para>
+		<para>
+			!!EL!! Links to the files !!I!! I don't know where to put them!<!-- TODO -->
+		</para>
 	</section>
 	
 	<section id="sect-Musicians_Guide-Ardour-Editing">
@@ -389,7 +392,7 @@
 			This section covers the basics of preparing "Here Is How."  The focus is on trimming the regions and placing them in the right position on the timeline.  Since the goal is to replicate the form of the original song, there is little room for artistic freedom.
 		</para>
 		<para>
-			To get the most out of this section, you should use the tutorial files provided above.  By following the instructions with the tutorial file, you will be able to use real editing, mixing, and mastering techniques to create a real song.  The contents of the tutorial files, along with information on how to get them, are posted above in the [[User:Crantila/FSC/Recording/Ardour#Tutorial Files|Tutorial Files]] section.
+			To get the most out of this section, you should use the tutorial files provided above.  By following the instructions with the tutorial file, you will be able to use real editing, mixing, and mastering techniques to create a real song.  Instructions to get the tutorial files are available in <xref linkend="sect-Musicians_Guide-Ardour-Tutorial_Files" />.
 		</para>
 		
 		<section id="sect-Musicians_Guide-Ardour-Editing-Add_Tracks_and_Busses">
@@ -1093,7 +1096,7 @@
 			<section id="sect-Musicians_Guide-Ardour-Mastering-Choosing_Export_Format">
 				<title>Choose the Export Format</title>
 				<para>
-					Ardour offers quite a variety of output formats, and knowing which to choose can be baffling.  Not all options are available with all file types.  Fedora Linux does not support MP3 files by default, for legal reasons.  For more information, refer to [http://fedoraproject.org/wiki/Multimedia/MP3 this web page].
+					Ardour offers quite a variety of output formats, and knowing which to choose can be baffling.  Not all options are available with all file types.  Fedora Linux does not support MP3 files by default, for legal reasons.  For more information, refer to <citetitle>MP3 (Fedora Project Wiki)</citetitle>, available at <ulink url="http://fedoraproject.org/wiki/Multimedia/MP3" />.
 				</para>
 				<para>
 					The tutorial's regions have 24-bit samples, recorded at a 48 kHz rate.  Exporting any part of the session with a higher sample format or sample rate is likely to result in decreased audio quality.
@@ -1103,12 +1106,12 @@
 					<itemizedlist>
 					<listitem><para>WAV: An uncompressed format designed by Microsoft.  Recommended only if further audio manipulation is intended.  Carries only audio data, so information like title, artist, and composer will be lost.  Playable with almost any device.</para></listitem>
 					<listitem><para>AIFF: An uncompressed format designed by Apple.  Recommended only if further audio manipulation is intended.  Carries only audio data, so information like title, artist, and composer will be lost.  Playable with almost any DAW and some audio players.</para></listitem>
-					<listitem><para>FLAC: An open-source compressed format.  A "lossless" format, meaning no audio information is lost during compression and decompression.  Audio quality is equal to WAV or AIFF formats.  Capable of carrying metadata, so information like title, artist, and composer will be preserved.  Widely supported in Linux by default.  For other popular operating systems, refer to [http://flac.sourceforge.net/download.html this web page] for a list of applications capable of playing FLAC files.  This is usually the best choice for distributing high-quality audio to listeners.</para></listitem>
-					<listitem><para>Ogg/Vorbis: An open-source compressed format.  A "lossy" format, meaning some audio information is lost during compression and decompression.  Audio quality is less than WAV or AIFF formats, but usually better than MP3.  Capable of carrying metadata, so information like title, artist, and composer will be preserved.  Widely supported in Linux by default.  For other popular operating systems, following the instructions on [http://www.vorbis.com/ this web page].  This is a good choice for distributing good-quality audio to listeners.</para></listitem>
+					<listitem><para>FLAC: An open-source compressed format.  A "lossless" format, meaning no audio information is lost during compression and decompression.  Audio quality is equal to WAV or AIFF formats.  Capable of carrying metadata, so information like title, artist, and composer will be preserved.  Widely supported in Linux by default.  For other popular operating systems, refer to <citetitle>Download Extras (FLAC Website)</citetitle> at <ulink url="http://flac.sourceforge.net/download.html#extras" /> for a list of applications and programs capable of playing FLAC files.  This is usually the best choice for distributing high-quality audio to listeners.</para></listitem>
+					<listitem><para>Ogg/Vorbis: An open-source compressed format.  A "lossy" format, meaning some audio information is lost during compression and decompression.  Audio quality is less than WAV or AIFF formats, but usually better than MP3.  Capable of carrying metadata, so information like title, artist, and composer will be preserved.  Widely supported in Linux by default.  For other popular operating systems, follow the instructions on the <citetitle>Vorbis Website</citetitle>, available at <ulink url="http://www.vorbis.com/" />.  This is a good choice for distributing good-quality audio to listeners.</para></listitem>
 					</itemizedlist>
 				</para>
 				<para>
-					A higher setting for !!L!!sample format!!L!! allows a greater amount of audio information to be stored per sample.  32&nbsp;bit support is virtually non-existant, but and you will probably not need to use this format in the near future.  The "float" format stores samples in a different internal format, and you will need it only rarely.
+					A higher setting for the sample format (explained in <xref linkend="sect-Musicians_Guide-Sample_Format" />) allows a greater amount of audio information to be stored per sample.  32&nbsp;bit support is virtually non-existent, and you will probably not need to use this format in the near future.  The "float" format stores samples in a different internal format, and you will need it only rarely.
 				</para>
 				<para>
 					If you are exporting audio for high-end equipment, or for further processing, choose the 24-bit format.  Otherwise, choose the 16-bit format, which is the sample format of audio CDs.
@@ -1117,7 +1120,7 @@
 					"Sample endianness" is a difficult concept to understand, and it has no effect on the resulting audio - just how it is stored..  Unless you are using a rare PowerPC computer, choose the "Little-endian (Intel)" option.
 				</para>
 				<para>
-					A higher !!L!!sample rate!!L!! allows a greater amount of audio information to be stored, but increases the size of audio files.
+					A higher sample rate (explained in <xref linkend="sect-Musicians_Guide-Sample_Rate" />) allows a greater amount of audio information to be stored, but increases the size of audio files.
 					<!-- <itemizedlist>
 					<listitem><para>22.05&nbsp;kHz: A low sample rate, and possibly insufficient.</para></listitem>
 					<listitem><para>44.1&nbsp;kHz: A good, standard sample rate.  This is the sample rate of audio CDs, and it is always a safe choice.</para></listitem>
diff --git a/en-US/Audio_Vocabulary.xml b/en-US/Audio_Vocabulary.xml
index 2a7eece..fa22d0e 100644
--- a/en-US/Audio_Vocabulary.xml
+++ b/en-US/Audio_Vocabulary.xml
@@ -22,8 +22,8 @@
 		<para>
 			<!-- [[File:FMG-bus.xcf]] -->
 			<!-- [[File:FMG-master_sub_bus.xcf]] -->
-			[[File:FMG-bus.png|200px|How audio busses work.]]
-			[[File:FMG-master_sub_bus.png|200px|The relationship between the master bus and sub-master busses.]]
+			[[File:FMG-bus.png|200px|How audio busses work.]] !!P!!
+			[[File:FMG-master_sub_bus.png|200px|The relationship between the master bus and sub-master busses.]] !!P!!
 		</para>
 		<para>
 			An '''audio bus''' sends audio signals from one place to another.  Many different signals can be inputted to a bus simultaneously, and many different devices or applications can read from a bus simultaneously.  Signals inputted to a bus are mixed together, and cannot be separated after entering a bus.  All devices or applications reading from a bus receive the same signal.
@@ -52,14 +52,14 @@
 		</para>
 		<para>
 			<itemizedlist>
-			<listitem><para>[http://www.digido.com/level-practices-part-2-includes-the-k-system.html "Level Practices"] (the type of meter described here is available in the "jkmeter" package from Planet CCRMA at Home).</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/K-system "K-system"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Headroom_%28audio_signal_processing%29 "Headroom"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Equal-loudness_contour "Equal-loudness contour"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Sound_level_meter "Sound level meter"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Listener_fatigue "Listener fatigue"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Dynamic_range_compression "Dynamic range compression"]</para></listitem>
-			<listitem><para>[http://en.wikipedia.org/wiki/Alignment_level "Alignment level"]</para></listitem>
+			<listitem><para><citetitle>Level Practices</citetitle>, available at <ulink url="http://www.digido.com/level-practices-part-2-includes-the-k-system.html" />.  The type of meter described here is available in the "jkmeter" package from Planet CCRMA at Home.</para></listitem>
+			<listitem><para><citetitle>K-system (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/K-system" />.</para></listitem>
+			<listitem><para><citetitle>Headroom (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Headroom_%28audio_signal_processing%29" />.</para></listitem>
+			<listitem><para><citetitle>Equal-Loudness Contour (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Equal-loudness_contour" />.</para></listitem>
+			<listitem><para><citetitle>Sound Level Meter (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Sound_level_meter" />.</para></listitem>
+			<listitem><para><citetitle>Listener Fatigue (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Listener_fatigue" />.</para></listitem>
+			<listitem><para><citetitle>Dynamic Range Compression (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Dynamic_range_compression" />.</para></listitem>
+			<listitem><para><citetitle>Alignment Level (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Alignment_level" />.</para></listitem>
 			</itemizedlist>
 		</para>
 	</section>
@@ -67,7 +67,7 @@
 	<section id="sect-Musicians_Guide-Vocabulary-Panning_and_Balance">
 		<title>Panning and Balance</title>
 		<para>
-			[[File:FMG-Balance_and_Panning.png|200px|left|The difference between adjusting panning and adjusting balance.]]
+			[[File:FMG-Balance_and_Panning.png|200px|left|The difference between adjusting panning and adjusting balance.]] !!P!!
 			<!-- [[File:FMG-Balance_and_Panning.xcf]] -->
 		</para>
 		<para>
@@ -92,7 +92,7 @@
 			<listitem><para>Bars and Beats: Usually used for MIDI work, and called "BBT," meaning "Bars, Beats, and Ticks."  A tick is a partial beat.</para></listitem>
 			<listitem><para>Minutes and Seconds: Usually used for audio work.</para></listitem>
 			<listitem><para>SMPTE Timecode: Invented for high-precision coordination of audio and video, but can be used with audio alone.</para></listitem>
-			<listitem><para>Samples: Relating directly to the format of the underlying audio file, a sample is the shortest possible length of time in an audio file.  See [[User:Crantila/FSC/Sound_Cards#Sample_Rate|this section]] for more information on samples.</para></listitem>
+			<listitem><para>Samples: Relating directly to the format of the underlying audio file, a sample is the shortest possible length of time in an audio file.  See <xref linkend="sect-Musicians_Guide-Sample_Rate_and_Sample_Format" /> for more information.</para></listitem>
 			</itemizedlist>
 		</para>
 		<para>
@@ -113,7 +113,7 @@
 	<section id="sect-Musicians_Guide-Vocabulary-Routing_and_Multiplexing">
 		<title>Routing and Multiplexing</title>
 		<para>
-			[[File:FMG-routing_and_multiplexing.png|200px|left|Illustration of routing and multiplexing in the "Connections" window of the QjackCtl interface.]]
+			[[File:FMG-routing_and_multiplexing.png|200px|left|Illustration of routing and multiplexing in the "Connections" window of the QjackCtl interface.]] !!P!!
 			<!-- [[FMG-routing_and_multiplexing.xcf]] -->
 		</para>
 		<para>
diff --git a/en-US/Digital_Audio_Workstations.xml b/en-US/Digital_Audio_Workstations.xml
index 637917b..9759a7a 100644
--- a/en-US/Digital_Audio_Workstations.xml
+++ b/en-US/Digital_Audio_Workstations.xml
@@ -10,7 +10,7 @@
 		The term '''Digital Audio Workstation''' (henceforth '''DAW''') refers to the entire hardware and software setup used for professional (or professional-quality) audio recording, manipulation, synthesis, and production.  It originally referred to devices purpose-built for the task, but as personal computers have become more powerful and wide-spread, certain specially-designed personal computers can also be thought of as DAWs.  The software running on these computers, especially software capable of multi-track recording, playback, and synthesis, is simply called "DAW software," which is often shortened to "DAW."  So, the term "DAW" and its usage are moderately ambiguous, but generally refer to one of the things mentioned.
    </para>
    <para>
-	   The !!L!! "Sound Cards and Digital Audio" Section !!L!! has other words that are important to know.
+	   For other terms related to digital audio, see <xref linkend="chap-Musicians_Guide-Sound_Cards" />.
    </para>
 	
 	<section id="sect-Musicians_Guide-Knowing_Which_DAW_to_Use">
@@ -78,9 +78,9 @@
    	   	<para>
 					It takes experience and practice to gain the skills involved in successful recording, mixing, and mastering.  Further information about these procedures is available from many places, including these web pages:
 					<itemizedlist>
-					<listitem><para>[http://www.64studio.com/howto-mastering "Mastering your final mix"]</para></listitem>
-					<listitem><para>[http://en.wikipedia.org/wiki/Audio_mixing_%28recorded_music%29 "Audio mixing (recorded music)"]</para></listitem>
-					<listitem><para>[http://en.wikipedia.org/wiki/Multitrack_recording "Multitrack recording"]</para></listitem>
+					<listitem><para><citetitle>Mastering Your Final Mix (64studio)</citetitle>, available at <ulink url="http://www.64studio.com/howto-mastering" />.</para></listitem>
+					<listitem><para><citetitle>Audio Mixing (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Audio_mixing_%28recorded_music%29" />.</para></listitem>
+					<listitem><para><citetitle>Multitrack Recording (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Multitrack_recording" />.</para></listitem>
 					</itemizedlist>
       		</para>
       	</section>
@@ -131,7 +131,7 @@
 		   <title>Relationship of Session, Track, and Region</title>
 		   <para>
 				<!-- [[File:Ardour-session_track_region.xcf]] -->
-				[[File:Ardour-session_track_region.png|200px|left|Session, Track, and Region in Ardour.]]
+				[[File:Ardour-session_track_region.png|200px|left|Session, Track, and Region in Ardour.]] !!P!!
 		   </para>
 	   </section>
 	
@@ -156,21 +156,21 @@
 	<section id="sect-Musicians_Guide-DAW_User_Interface">
 		<title>User Interface</title>
 		<para>
-		<!--
-File:Qtractor-interface.xcf
-File:Qtractor-interface-clocks.png
-File:Qtractor-interface-messages.png
-File:Qtractor-interface-track.png
-File:Qtractor-interface-track_info.png
-File:Qtractor-interface-transport.png
--->
-This section describes various components of software-based DAW interfaces.  Although the Qtractor application is visible in the images, both Ardour and Rosegarden (along with most other DAW software) have an interface that differs only in details, such as which buttons are located where.
+			<!--
+			File:Qtractor-interface.xcf
+			File:Qtractor-interface-clocks.png
+			File:Qtractor-interface-messages.png
+			File:Qtractor-interface-track.png
+			File:Qtractor-interface-track_info.png
+			File:Qtractor-interface-transport.png
+			-->
+			This section describes various components of software-based DAW interfaces.  Although the Qtractor application is visible in the images, both Ardour and Rosegarden (along with most other DAW software) have an interface that differs only in details, such as which buttons are located where.
 		</para>
 	
 	   <section id="sect-Musicians_Guide-Messages_Pane">
 		   <title>"Messages" Pane</title>
 		   <para>
-				[[File:Qtractor-interface-messages.png|300px|"Messages" Pane]]
+				[[File:Qtractor-interface-messages.png|300px|"Messages" Pane]] !!P!!
 
 				The "messages" pane, shown in the above diagram, contains messages produced by the DAW, and sometimes messages produced by software used by the DAW, such as JACK.  If an error occurs, or if the DAW does not perform as expected, you should check the "messages" pane for information that may help you to get the desired results.  The "messages" pane can also be used to determine whether JACK and the DAW were started successfully, with the options you prefer.
 		   </para>
@@ -179,7 +179,7 @@ This section describes various components of software-based DAW interfaces.  Alt
 	   <section id="sect-Musicians_Guide-DAW_Clock">
 		   <title>Clock</title>
 		   <para>
-				[[File:Qtractor-interface-clocks.png|300px|Clock]]
+				[[File:Qtractor-interface-clocks.png|300px|Clock]] !!P!!
 
 				The clock shows the current place in the file, as indicated by the transport.  In the image, you can see that the transport is at the beginning of the session, so the clock indicates "0".  This clock is configured to show time in minutes and seconds, so it is a "time clock."  Other possible settings for clocks are to show BBT (bars, beats, and ticks - a "MIDI clock"), samples (a "sample clock"), or an SMPTE timecode (used for high-precision synchronization, usually with video - a "timecode clock").  Some DAWs allow the use of multiple clocks simultaneously.
          </para>
@@ -191,7 +191,7 @@ This section describes various components of software-based DAW interfaces.  Alt
 	   <section id="sect-Musicians_Guide-Track_Info_Pane">
 		   <title>"Track Info" Pane</title>
 		   <para>
-				[[File:Qtractor-interface-track_info.png|300px|"Track Info" Pane]]
+				[[File:Qtractor-interface-track_info.png|300px|"Track Info" Pane]] !!P!!
 
 				The "track info" pane contains information and settings for each track and bus in the session.  Here, you can usually adjust settings like a track's or bus' input and output routing, the instrument, bank, program, and channel of MIDI tracks, and the three buttons shown on this image: "R" for "arm to record," "M" for "mute/silence track's output," and "S" for "solo mode," where only the selected tracks and busses are heard.
          </para>
@@ -206,7 +206,7 @@ This section describes various components of software-based DAW interfaces.  Alt
 	   <section id="sect-Musicians_Guide-Track_Pane">
 		   <title>"Track" Pane</title>
 		   <para>
-				[[File:Qtractor-interface-track.png|300px|"Track" Pane]]
+				[[File:Qtractor-interface-track.png|300px|"Track" Pane]] !!P!!
 
 				The "track" pane is the main workspace in a DAW.  It shows regions (also called "clips") with a rough overview of the audio wave-form or MIDI notes, allows you to adjust the starting-time and length of regions, and also allows you to assign or re-assign a region to a track.  The "track" pane shows the transport as a vertical line; in this image it is the left-most red line in the "track" pane.
          </para>
@@ -218,7 +218,7 @@ This section describes various components of software-based DAW interfaces.  Alt
 	   <section id="sect-Musicians_Guide-DAW_Transport_Controls">
 		   <title>Transport Controls</title>
 		   <para>
-			   [[File:Qtractor-interface-transport.png|300px|Transport Controls]]
+			   [[File:Qtractor-interface-transport.png|300px|Transport Controls]] !!P!!
 			   
 			   The transport controls allow you to manipulate the transport in various ways.  The shape of the buttons is somewhat standardized; a similar-looking button will usually perform the same function in all DAWs, as well as in consumer electronic devices like CD players and DVD players.
          </para>
diff --git a/en-US/FluidSynth.xml b/en-US/FluidSynth.xml
index a597863..ef2e598 100644
--- a/en-US/FluidSynth.xml
+++ b/en-US/FluidSynth.xml
@@ -26,10 +26,10 @@
 			<para>
 				There is a large selection of SoundFonts available for free on the internet, and some are also available for purchase, including a few very high quality SoundFonts.  The following three websites have links to SoundFont resources, and some SoundFonts available for paid or free download.  No guarantee is made of the quality of the material provided, or of the quality and security of the websites.
 				<itemizedlist>
-				<listitem><para>[http://www.schristiancollins.com/generaluser.php S. Christian Collins' "GeneralUser" SoundFont]</para></listitem>
-				<listitem><para>[http://www.hammersound.net/cgi-bin/soundlink.pl HammerSound SoundFont Library]</para></listitem>
-				<listitem><para>[http://soundfonts.homemusician.net/ homemusician.net's SoundFont Library]</para></listitem>
-				<listitem><para>[http://www.synthzone.com/soundfont.htm Synth Zone]</para></listitem>
+				<listitem><para><citetitle>S. Christian Collins' "GeneralUser" SoundFont</citetitle>, available at <ulink url="http://www.schristiancollins.com/generaluser.php" />.</para></listitem>
+				<listitem><para><citetitle>HammerSound SoundFont Library</citetitle>, available at <ulink url="http://www.hammersound.net/cgi-bin/soundlink.pl" />.</para></listitem>
+				<listitem><para><citetitle>homemusician.net SoundFont Library</citetitle>, available at <ulink url="http://soundfonts.homemusician.net/" />.</para></listitem>
+				<listitem><para><citetitle>Synth Zone</citetitle>, available at <ulink url="http://www.synthzone.com/soundfont.htm" />.</para></listitem>
 				</itemizedlist>
 			</para>
 			<para>
@@ -68,7 +68,7 @@
 		<section id="sect-Musicians_Guide-FluidSynth-Req_and_Inst-Software_Requirements">
 			<title>Software Requirements</title>
 			<para>
-				FluidSynth requires the JACK Audio Connection Kit.  If you have not already installed the JACK packages from the Planet CCRMA at Home repository, then it is recommended that you do so ''before'' installing FluidSynth.  Follow the instructions [[User:Crantila/FSC/Sound_Servers#Installing_JACK|here]].
+				FluidSynth requires the JACK Audio Connection Kit.  If you have not already installed the JACK packages from the Planet CCRMA at Home repository, then it is recommended that you do so ''before'' installing FluidSynth.  See <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> for instructions.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-FluidSynth-Req_and_Inst-Installation">
@@ -89,7 +89,7 @@
 			<step><para>Use "PackageKit" or "KPackageKit" to install the "qsynth" package.</para></step>
 			<step><para>Review and approve the proposed installation:
 				<itemizedlist>
-				<listitem><para>If the "jack-audio-connection-kit" package is listed, it is recommended that you follow [[User:Crantila/FSC/Sound_Servers#Installing_JACK|these instructions]] to install the JACK Audio Connection Kit before installing Qsynth.</para></listitem>
+				<listitem><para><!-- this was another warning to install JACK but I don't want it any more --></para></listitem>
 				<listitem><para>The installation may include the "fluid-soundfont-gm" package, which is quite large.</para></listitem>
 				</itemizedlist>
 				</para></step>
@@ -105,7 +105,7 @@
 				# Use PackageKit or KPackageKit to install the ''fluidsynth'' package, or use a terminal to run <code>su -c 'yum install fluidsynth'</code>
 			</para>
 		</section>
-		<section id="sect-Musicians_Guide-FluidSynth-Req_and_Inst-">
+		<section id="sect-Musicians_Guide-FluidSynth-Req_and_Inst">
 			<title>Installation of SoundFont Files</title>
 			<para>
 				Qsynth automatically installs a SoundFont for use with FluidSynth, but if you did not install Qsynth, or if you want to add additional SoundFont files with additional programs, you will need to install them separately.  The Fedora package repositories offer a small selection of SoundFont files, which you can find by searching for "soundfont" with PackageKit, KPackageKit, or yum.  These files will automatically be installed correctly.  If you wish to install additional SoundFont files, it is recommended that you install them in the same location - and with the same security settings - as the ones available from the Fedora repositories.  If you do this, then you enable all users of the computer system to access the files, you will not "lose" them if you forget where they are stored, and you help to minimize the potential security risk of using software downloaded from the internet.
@@ -170,7 +170,7 @@
 				To configure an additional SoundFont:
 			</para>
 			<procedure>
-			<step><para>Click on the 'Open' button, and navigate to the path of the SoundFont you wish to add.  This should be <code>/usr/share/soundfonts</code>, if installed to the standard location specified !!L!!above.!!L!!</para></step>
+			<step><para>Click on the 'Open' button, and navigate to the path of the SoundFont you wish to add.  This should be <code>/usr/share/soundfonts</code>, if installed to the standard location specified in <xref linkend="sect-Musicians_Guide-FluidSynth-Req_and_Inst" />.</para></step>
 			<step><para>Select the additional SoundFont, then click the 'Open' button.</para></step>
 			<step><para>To change the SoundFont ID (SFID), use the 'Up' and 'Down' buttons to change the position of the SoundFonts, as desired.  This does not directly change the function of FluidSynth - any SoundFont should work with any SoundFont ID number.</para></step>
 			</procedure>
@@ -178,7 +178,7 @@
 		<section id="sect-Musicians_Guide-FluidSynth-Configuring-JACK_Output">
 			<title>JACK Output Configuration</title>
 			<para>
-				It is possible to configure FluidSynth to output synthesized audio either to JACK or to ALSA.  The default, and recommended, method is to output synthesized audio to JACK.  This allows the greatest control over audio quality, and the greatest flexibility in terms of routing and multiplexing (see !!L!! [[User:Crantila/FSC/Recording/DAW_Common_Elements#Routing_and_Multiplexing|definition here]] !!L!!), which allows you to simultaneously record the synthesized audio signal and listen to it.
+				It is possible to configure FluidSynth to output synthesized audio either to JACK or to ALSA.  The default, and recommended, method is to output synthesized audio to JACK.  This allows the greatest control over audio quality, and the greatest flexibility in terms of routing and multiplexing (for a definition, see <xref linkend="sect-Musicians_Guide-Vocabulary-Routing_and_Multiplexing" />), which allows you to simultaneously record the synthesized audio signal and listen to it.
 			</para>
 			<para>
 				If you are having problems, you may wish to confirm that Qsynth is configured correctly to use JACK.
@@ -215,7 +215,7 @@
 				<listitem><para>When set to "jack", the input will appear on QjackCtl's "MIDI" tab in the "Connect" window.  This is useful if the MIDI generator device that you are using, such as Rosegarden, is connected directly to JACK.</para></listitem>
 				</itemizedlist>
 				</para></step>
-			<step><para>You can set the number of MIDI input channels provided by FluidSynth.  Refer to the information below, in !!L!!"Changing the Number of MIDI Input Channels"!!L!!</para></step>
+			<step><para>You can set the number of MIDI input channels provided by FluidSynth.  Refer to <xref linkend="sect-Musicians_Guide-FluidSynth-Changing_Number_of_Input_Channels" /> below.</para></step>
 			</procedure>
 		</section>
 		<section id="sect-Musicians_Guide-FluidSynth-Configuring-Viewing_all_Settings">
diff --git a/en-US/Frescobaldi.xml b/en-US/Frescobaldi.xml
index 9e4a4a0..33865b7 100644
--- a/en-US/Frescobaldi.xml
+++ b/en-US/Frescobaldi.xml
@@ -33,7 +33,7 @@
 			</itemizedlist>
 		</para>
 		<para>
-			For further information on Frescobaldi, please refer to the project's [http://www.frescobaldi.org/ website].
+			For more information, refer to the <citetitle>Frescobaldi Website</citetitle>, available at <ulink url="http://www.frescobaldi.org/" />.
 		</para>
 	</section>
 	
@@ -87,7 +87,7 @@
 	<section id="sect-Musicians_Guide-Frescobaldi-Using">
 		<title>Using Frescobaldi</title>
 		<para>
-			The practical use of Frescobaldi for editing LilyPond source files is described in the "LilyPond" chapter of this Guide, available [[User:Crantila/FSC/Typesetting/LilyPond|here]].
+			The practical use of Frescobaldi for editing LilyPond source files is described in <xref linkend="chap-Musicians_Guide-LilyPond" />.
 		</para>
 	</section>
 	
diff --git a/en-US/LilyPond/LilyPond-counterpoint.xml b/en-US/LilyPond/LilyPond-counterpoint.xml
index 4557cf1..d00e317 100644
--- a/en-US/LilyPond/LilyPond-counterpoint.xml
+++ b/en-US/LilyPond/LilyPond-counterpoint.xml
@@ -7,14 +7,17 @@
 <section id="sect-Musicians_Guide-LilyPond-Counterpoint">
 	<title>Working on a Counterpoint Exercise (Tutorial)</title>
 	<para>
-		Imagine you're in Counterpoint class, and you've been asked to submit a very clean copy of your next assignment.  Since you don't want to pay $450,000 for a commercially-available engraving solution and a fruity computer to use it, you decide that LilyPond is the solution for you.
+		Imagine you're in Counterpoint class, and you've been asked to submit a very clean copy of your next assignment.  Since you don't want to pay $450,000 for a commercially-available engraving solution and a fruity computer to use it, you decide that LilyPond is the solution for you. <!-- Drop the sarcasm (TODO) -->
 	</para>
 	
 	<section id="sect-Musicians_Guide-LilyPond-Counterpoint-Tutorial_Files">
 		<title>Files for the Tutorial</title>
 		<para>
-			* [[User:Crantila/FSC/Typesetting/LilyPond/Counterpoint|Counterpoint (LilyPond)]]
-			* [[File:FSC-counterpoint.pdf]]
+			You do not need these files to do the tutorial; they are example completions.
+			<itemizedlist>
+			<listitem><para><citetitle>LilyPond Input File</citetitle> at <ulink url="https://fedoraproject.org/wiki/User:Crantila/FSC/Typesetting/LilyPond/Counterpoint" />.</para></listitem>
+			<listitem><para><citetitle>PDF Output File</citetitle> at <ulink url="https://fedoraproject.org/wiki/File:FSC-counterpoint.pdf" />.</para></listitem>
+			</itemizedlist>
 		</para>
 	</section>
 	
diff --git a/en-US/LilyPond/LilyPond-orchestra.xml b/en-US/LilyPond/LilyPond-orchestra.xml
index fbe2ff5..4c13cf6 100644
--- a/en-US/LilyPond/LilyPond-orchestra.xml
+++ b/en-US/LilyPond/LilyPond-orchestra.xml
@@ -19,8 +19,9 @@
 	<section id="sect-Musicians_Guide-LilyPond-Orchestra-Tutorial_Files">
 		<title>Files for the Tutorial</title>
 		<para>
-			* [[User:Crantila/FSC/Typesetting/LilyPond/Orchestra|Orchestra (LilyPond)]]
-			* [[File:FSC-orchestra.pdf]]
+			You do not need the LilyPond input file to do the tutorial.  Use the input file if you encounter problems, or to compare with your completed version.  You do need the PDF output file to do the tutorial.
+			<itemizedlist>
+			<listitem><para><citetitle>LilyPond Input File</citetitle> at <ulink url="https://fedoraproject.org/wiki/User:Crantila/FSC/Typesetting/LilyPond/Orchestra" />.</para></listitem>
+			<listitem><para><citetitle>PDF Output File</citetitle> at <ulink url="https://fedoraproject.org/wiki/File:FSC-orchestra.pdf" />.</para></listitem>
+			</itemizedlist>
 		</para>
 	</section>
 	
@@ -170,7 +171,7 @@
 				  </substeps>
 				  </step>
 				<step><para>Now to add the "forte" marking.  You can add text (or any object, for that matter) onto a note (or rest, etc.) with one of these three symbols:
-				  <itemizedlist>
+				  <itemizedlist> <!-- This should refer to the syntax section (TODO) -->
 				  <listitem><para>^ meaning "put this above the object"</para></listitem>
 				  <listitem><para>- meaning "put this above or below, as you think is best"</para></listitem>
 				  <listitem><para>_ meaning "put this below the object"</para></listitem>
diff --git a/en-US/LilyPond/LilyPond-piano.xml b/en-US/LilyPond/LilyPond-piano.xml
index 67cba2b..9864ccd 100644
--- a/en-US/LilyPond/LilyPond-piano.xml
+++ b/en-US/LilyPond/LilyPond-piano.xml
@@ -19,8 +19,9 @@
 	<section id="sect-Musicians_Guide-LilyPond-Piano-Tutorial_Files">
 		<title>Files for the Tutorial</title>
 		<para>
-			* [[User:Crantila/FSC/Typesetting/LilyPond/Piano|Piano (LilyPond)]]
-			* [[File:FSC-piano.pdf]]
+			You do not need the LilyPond input file to do the tutorial.  Use the input file if you encounter problems, or to compare with your completed version.  You do need the PDF output file to do the tutorial.
+			<itemizedlist>
+			<listitem><para><citetitle>LilyPond Input File</citetitle> at <ulink url="https://fedoraproject.org/wiki/User:Crantila/FSC/Typesetting/LilyPond/Piano" />.</para></listitem>
+			<listitem><para><citetitle>PDF Output File</citetitle> at <ulink url="https://fedoraproject.org/wiki/File:FSC-piano.pdf" />.</para></listitem>
+			</itemizedlist>
 		</para>
 	</section>
 	
diff --git a/en-US/LilyPond/LilyPond-syntax.xml b/en-US/LilyPond/LilyPond-syntax.xml
index def4068..6fa5c77 100644
--- a/en-US/LilyPond/LilyPond-syntax.xml
+++ b/en-US/LilyPond/LilyPond-syntax.xml
@@ -22,7 +22,7 @@
 			Pitch can be entered either absolutely, or relative to the preceding notes.  Usually (for music without frequent large leaps) it is more convenient to use the "relative" mode.  The symbols , and ' (comma and apostrophe) are used to indicate register.
 		</para>
 		<para>
-			When entering pitches absolutely, the register is indicated mostly as in the [http://en.wikipedia.org/wiki/Helmholtz_pitch_notation Helmholtz system]: octaves begin on the pitch "C," and end eleven tones higher on the pitch "B."  The octave below "middle C" (octave 3 in [http://en.wikipedia.org/wiki/Scientific_pitch_notation Scientific Pitch Notation]) has no commas or apostrophes.  The octave starting on "middle C" (octave 4) has one apostrophe; the octave above that (octave 5) has two apostrophes, and so on.  Octave 2 (starting two octaves below "middle C") has one comma, the octave below that has two commas, and so on.  It is usually not necessary to understand how to use this in LilyPond, or to be able to use it quickly, because most scores will use "relative mode."
+			When entering pitches absolutely, the register is indicated mostly as in the Helmholtz system (see <citetitle>Helmholtz Pitch Notation (Wikipedia)</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Helmholtz_pitch_notation" />): octaves begin on the pitch "C," and end eleven tones higher on the pitch "B."  The octave below "middle C" (octave 3 in scientific pitch notation - see <citetitle>Scientific Pitch Notation (Wikipedia)</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Scientific_pitch_notation" />) has no commas or apostrophes.  The octave starting on "middle C" (octave 4) has one apostrophe; the octave above that (octave 5) has two apostrophes, and so on.  Octave 2 (starting two octaves below "middle C") has one comma, the octave below that has two commas, and so on.  It is usually not necessary to understand how to use this in LilyPond, or to be able to use it quickly, because most scores will use "relative mode."
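+			For example, here is a short sketch of absolute pitch entry (the particular pitches are chosen only for illustration and are not taken from the guide's examples):
+		</para>
+<programlisting>
+{
+  c4 e g c'   % absolute entry: small-octave C, E, and G, then middle C (one apostrophe)
+  c''2 c,     % two apostrophes: the octave above middle C; a comma: the octave below the small octave
+}
+</programlisting>
+		<para>
+			Relative mode, described next, usually removes the need for most of these register marks.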
 		</para>
 		<para>
 			When entering pitches relative to the previous one, the register is still indicated with commas or apostrophes, but usually none are needed.  When using this input mode, the octave of each note is guessed based on the octave of the previous note.  Think of it this way: the next note will always be placed so it produces the smaller interval.  For example, after a C, an E could be placed as a major third, a minor sixth, a major tenth, a minor thirteenth, and so on.  In relative mode, LilyPond will always choose the "major third" option.  If you wanted LilyPond to notate the E so that it's a minor sixth, you would tell LilyPond with a comma appended: <code>c e,</code> so that LilyPond knows what you want.  It's the same case if you were to input <code>c aes</code> (meaning "C then A-flat"): the A-flat will be notated so that it is a major third from the C; if you wanted LilyPond to notate it so that the A-flat is a minor sixth higher than the C, you would need to append an apostrophe: <code>c aes'</code>
@@ -345,7 +345,7 @@
 				The first bar has four half-notes, which is twice as many beats as are allowed.  LilyPond will print a warning at the first bar-check symbol.
 			</para>
 			<para>
-				You should always fix the first warning printed by LilyPond, then reprocess the file and fix remaining warnings.  One mistake sometimes triggers more than one bar-check warning, and fixing the first warning also fixe
+				You should always fix the first warning printed by LilyPond, then reprocess the file and fix remaining warnings.  One mistake sometimes triggers more than one bar-check warning, so if you fix the first warning, the rest may disappear.
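+				For example, in this short sketch (an illustration only, not one of the guide's examples), the first bar has one beat too many for its 4/4 time signature, so the first bar check fails, and the later bar checks also report warnings even though those bars are correct:
+			</para>
+<programlisting>
+\relative c'' {
+  \time 4/4
+  c4 c c c c |   % five quarter notes in a 4/4 bar: LilyPond warns that this bar check failed
+  c c c c |      % this bar is correct, but its check also fails until the extra note above is removed
+  c1 |
+}
+</programlisting>
+			<para>
+				After you remove the extra note and reprocess the file, the remaining warnings should disappear.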
 			</para>
 		</section>
 	</section>
diff --git a/en-US/LilyPond/LilyPond.xml b/en-US/LilyPond/LilyPond.xml
index 55b1cd9..b08a147 100644
--- a/en-US/LilyPond/LilyPond.xml
+++ b/en-US/LilyPond/LilyPond.xml
@@ -32,37 +32,35 @@
 	<section id="sect-Musicians_Guide-LilyPond-How_LilyPond_Works">
 		<title>How LilyPond Works</title>
 		<para>
-			LilyPond itself can be thought of as a highly-skilled automobile mechanic with a well-equipped toolbox.  Regardless of the problem presented to the mechanic, they will automatically know which tools to use.  Some of their tools are available elsewhere, and some are quite basic, but it is not the tools themselves that gives the mechanic their power, but rather the ways in which the mechanic uses the tools.  Similarly, LilyPond is run as a single program (as you would consult a single mechanic), but it employs a multitude of tools - some available elsewhere, but others specialized, so that the resulting output is useful to us in some way.
+			Think of LilyPond as an automobile mechanic.  When your car breaks down, the mechanic knows which tools to use.  You can buy tools to fix your car by yourself, but the mechanic is specialized.  The mechanic knows what tools to use, how to prepare the tools, and how to fix your car faster than you can fix it.  LilyPond uses many programs that you can use by yourself, but LilyPond is specialized.  LilyPond knows what programs to use, what settings to use, and most importantly, LilyPond takes much less time than if you use the programs directly.
 		</para>
 		<para>
-			In order to create meaningful output with LilyPond, we need to provide it meaningful instructions, and this we do in text files with the *.ly extension.  These instructions will tell LilyPond which tools to use, when, and with what settings.  Sometimes, we will need to change LilyPond's default settings; this is intimidating at first, but really we are just providing more specific instructions for what we want LilyPond to do.  To return to the mechanic metaphor: not only would you be able to ask for repairs, but you will be able to ask for specific components or specific methods of installation; once you realize the power, flexibility, and easiness that is changing LilyPond settings, you will have truly unleashed its power.
+			We give instructions to LilyPond in specially-formed text files.  LilyPond input files describe the music to notate.  LilyPond decides how the music will look, then creates an output file.  The input file does not contain instructions about how the music looks.  Sometimes you must make an adjustment to how the output file looks, so LilyPond lets you change the settings of its internal tools.
 		</para>
 	</section>
 	
 	<section id="sect-Musicians_Guide-LilyPond-The_LilyPond_Approach">
 		<title>The LilyPond Approach</title>
 		<para>
-			For an extensive explanation of the following section, please see [http://www.lilypond.org/about/automated-engraving/ the official LilyPond documentation], from where this was sourced.
+			For an extensive explanation of the material in this section, please see the <citetitle>LilyPond Website</citetitle> at <ulink url="http://www.lilypond.org/about/automated-engraving/" />, from which this section is adapted.
 		</para>
 		<para>
-			??
-			LilyPond works by separating the tasks of what to put, and where to put it.  Each aspect of an object's positioning is controlled by a specific plug-in.  Although software using "plug-ins" often results in messy and uncooperative plug-ins, such is not the case with LilyPond.  You can think of the plug-ins as tools.  LilyPond knows how and when to use each tool; if it doesn't know enough about a tool, then it isn't used, so there's no concern about half-baked plug-ins that work for one person in one situation, but for nothing else.
-			??
+			LilyPond works by separating the tasks of what to put, and where to put it. Each aspect of an object's position is controlled by a specific plug-in. You can think of the plug-ins as tools. LilyPond knows how and when to use each tool; if it doesn't know enough about a tool, then it isn't used. 
 		</para>
 		<para>
-			Before LilyPond places an object, it first considers many different possibilities for the specific alignment and layout of that object.  Then it evaluates the possibilities according to aesthetic criteria set out to reflect those used in hand-engraved notation.  After assigning each possibility a score representing how closely it resembles to the aesthetic ideal, LilyPond then chooses the least problematic possibility.
+			Before LilyPond places an object, it first considers many different possibilities for the specific alignment and layout of that object. Then it evaluates the possibilities according to aesthetic criteria set out to reflect those used in hand-engraved notation. After assigning each possibility a score representing how closely it resembles the aesthetic ideal, LilyPond chooses the best possibility.
 		</para>
 	</section>
 	
 	<section id="sect-Musicians_Guide-LilyPond-Installation">
 		<title>Requirements and Installation</title>
 		<procedure>
-			<step><para>Run <code>sudo -c 'yum install lilypond'</code> or use PackageKit or KPackageKit to install the "lilypond" package.</para></step>
-			<step><para>Review the dependencies; it will want to install a lot of things called lilypond-*-fonts</para></step>
-			<step><para>LilyPond can be run from the command-line, as 'lilypond'</para></step>
+			<step><para>Use PackageKit or KPackageKit to install the <literal>lilypond</literal> package.</para></step>
+			<step><para>Review the dependencies.  Many packages called <literal>lilypond-*-fonts</literal> are installed.</para></step>
+			<step><para>LilyPond is run from the command line, with the command <command>lilypond</command>.</para></step>
 		</procedure>
 		<para>
-			It is recommended that you use the "Frescobaldi" text editor, which is designed specifically for use with LilyPond.  It has many features that help to enhance productivity when editing LilyPond files, and which greatly speed up the learning process.  Please see this section of the Musicians' Guide for help installing Frescobaldi.
+			We recommend that you use the Frescobaldi text editor, which is designed specifically for LilyPond. It has many features that enhance productivity when editing LilyPond files, and that greatly speed up the learning process. Refer to <xref linkend="chap-Musicians_Guide-Frescobaldi" /> for more information.
 		</para>
 	</section>
 	
diff --git a/en-US/Qtractor.xml b/en-US/Qtractor.xml
index a9a3f0e..23af616 100644
--- a/en-US/Qtractor.xml
+++ b/en-US/Qtractor.xml
@@ -22,13 +22,13 @@
 		<section id="sect-Musicians_Guide-Qtractor-Knowledge_Requirements">
 			<title>Knowledge Requirements</title>
 			<para>
-				Qtractor is easy to use, and its user interface is similar to other DAWs.  We recommend that you read !!L!!common interface!!L!! if you have not used a DAW before.
+				Qtractor is easy to use, and its user interface is similar to other DAWs.  We recommend that you read <xref linkend="sect-Musicians_Guide-DAW_User_Interface" /> if you have not used a DAW before.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Qtractor-Software_Requirements">
 			<title>Software Requirements</title>
 			<para>
-				Qtractor uses the JACK Audio Connection Kit.  You should install JACK before installing Qtractor.  Follow the instructions !!L!! here !!L!! to install JACK.
+				Qtractor uses the JACK Audio Connection Kit.  You should install JACK before installing Qtractor.  See <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> for instructions to install JACK.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Qtractor-Hardware_Requirements">
@@ -40,13 +40,13 @@
 		<section id="sect-Musicians_Guide-Qtractor-Other_Requirements">
 			<title>Other Requirements</title>
 			<para>
-				You need a MIDI synthesizer to use Qtractor as a MIDI sequencer.  You can use hardware-based and software-based synthesizers with Qtractor.  We recommend using the software-based <code>FluidSynth</code> MIDI synthesizer.  See !!L!! this section !!L!! for information about <code>FluidSynth</code>.
+				You need a MIDI synthesizer to use Qtractor as a MIDI sequencer.  You can use hardware-based and software-based synthesizers with Qtractor.  We recommend using the software-based <code>FluidSynth</code> MIDI synthesizer.  See <xref linkend="chap-Musicians_Guide-FluidSynth" /> for information about <code>FluidSynth</code>.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Qtractor-Installation">
 			<title>Installation</title>
 			<para>
-				Qtractor is not available from the Fedora software repositories.  Qtractor is available from the "Planet CCRMA at Home" and "RPM Fusion" repositories.  If you have already enabled one of those repositories, you should install Qtractor from that repository.  If you have not already enabled one of those repositories, we recommend that you install Qtractor from the "Planet CCRMA at Home" repository.  See !!L!! this section !!L!! for instructions that enable the "Planet CCRMA at Home" repository.  The "Planet CCRMA at Home" repository contains a wide variety of music and audio applications.
+				Qtractor is not available from the Fedora software repositories.  Qtractor is available from the "Planet CCRMA at Home" and "RPM Fusion" repositories.  If you have already enabled one of those repositories, you should install Qtractor from that repository.  If you have not already enabled one of those repositories, we recommend that you install Qtractor from the "Planet CCRMA at Home" repository.  See <xref linkend="sect-Musicians_Guide-CCRMA_Installing_Repository" /> for instructions to enable the "Planet CCRMA at Home" repository.  The "Planet CCRMA at Home" repository contains a wide variety of music and audio applications.
 			</para>
 			<para>
 				After you enable the "RPM Fusion" or "Planet CCRMA at Home" repository, use PackageKit or KPackageKit to install the "qtractor" package.  Other required software is installed automatically.
@@ -64,7 +64,7 @@
 			<title>Audio Options</title>
 			<!-- "Options Window: Audio Tab" -->
 			<para>
-				The ''Capture/Export'' setting allows you to choose the format in which Qtractor stores its audio clips when recorded or exported.  You will be able to choose a file type, such as "WAV Microsoft" for standard ".wav" files, "AIFF Apple-SGI" for standard ".aiff" files, or the preferable "FLAC Lossless Audio Codec," format.  FLAC is an open-source, lossless, compressed format for storing audio signals and metadata.  See  the [http://flac.sourceforge.net/ FLAC website] for more information.  You will also be asked to select a quality setting for lossy compressed formats, or a sample format for all lossless formats.  The sample format is sometimes called "bit rate," which is described [[User:Crantila/FSC/Sound_Cards#Bit_Rate|here]].  If you don't know which sample format to choose, then "Signed 16-Bit" is a good choice for almost all uses, and will provide you with CD-quality audio.  Most non-speciality hardware is incapable of making good use of higher sample formats.
+				The ''Capture/Export'' setting allows you to choose the format in which Qtractor stores its audio clips when recorded or exported.  You can choose a file type, such as "WAV Microsoft" for standard ".wav" files, "AIFF Apple-SGI" for standard ".aiff" files, or the preferable "FLAC Lossless Audio Codec" format.  FLAC is an open-source, lossless, compressed format for storing audio signals and metadata.  See the <citetitle>FLAC Website</citetitle> at <ulink url="http://flac.sourceforge.net/" /> for more information.  You will also be asked to select a quality setting for lossy compressed formats, or a sample format for all lossless formats.  If you do not know which sample format to choose, then "Signed 16-Bit" is a good choice for almost all uses, and will provide you with CD-quality audio.  Most non-speciality hardware is incapable of making good use of higher sample formats.  See <xref linkend="sect-Musicians_Guide-Sample_Rate_and_Sample_Format" /> for more information about sample formats.
 			</para>
 			<para>
 				Setting the ''Transport mode'' will allow you to adjust the behaviour of the transport.
@@ -84,7 +84,7 @@
 			<title>MIDI Options</title>
 			<!-- "Options Window: MIDI Tab" -->
 			<para>
-				Adjusting the "File format" allows you to change how MIDI clips are stored.  You will not need to adjust this unless required by an external application.  The [http://en.wikipedia.org/wiki/Musical_Instrument_Digital_Interface#Standard_MIDI_.28.mid_or_.smf.29 Wikipedia article] about MIDI has further information about file formats.
+				Adjusting the "File format" allows you to change how MIDI clips are stored.  You will not need to adjust this unless required by an external application.  Refer to <citetitle>Musical Instrument Digital Interface: Standard MIDI (.mid or .smf) (Wikipedia)</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Musical_Instrument_Digital_Interface#Standard_MIDI_.28.mid_or_.smf.29" /> for more information about file formats.
 			</para>
 			<para>
 				"MMC" stands for "MIDI Machine Control," and it allows multiple MIDI-connected devices to interact and control each other.  Setting the ''Transport mode'' to a setting other than "None" allows it be controlled by MMC messages.
@@ -115,20 +115,6 @@
 		</section>
 	</section> <!-- Ends "Configuration" Section --> <!--    Qtractor-Configuration-    -->
 	
-	
-	
-	
-	
-	
-	
-	
-	
-	
-	
-	
-	
-	
-	
 	<section id="sect-Musicians_Guide-Qtractor-Using">
 		<title>Using Qtractor</title>
 		<para>
@@ -283,7 +269,7 @@
 					When creating a MIDI track, you can use the "omni" check-box to allow the track to respond to input from any MIDI channel.  If the check-box is unselected, the track will respond only to signals on its assigned MIDI channel.
 				</para>
 				<para>
-					In the matrix editor window, you can adjust the "velocity" (loudness) of a note by using the "Resize" MIDI Tool (see !!L!!"Using the Matrix Editor's MIDI Tools,"!!L!! above)
+					In the matrix editor window, you can adjust the "velocity" (loudness) of a note by using the "Resize" MIDI Tool (see <xref linkend="sect-Musicians_Guide-Qtractor-Using-MIDI_Tools" /> above).
 				</para>
 				<para>
 					If you find it difficult to work with Qtractor's matrix editor, but you find it easy to work with LilyPond, you can use this to your advantage.  LilyPond will output a MIDI-format representation of your score if you include a "midi" section in the "score" section.  It should look something like this:
@@ -327,12 +313,12 @@
 				<itemizedlist>
 				<listitem><para>A recording of the second movement from Beethoven's Piano Sonata No.23, "Appassionata," either:
 					<itemizedlist>
-					<listitem><para>[http://www.mutopiaproject.org/cgibin/make-table.cgi?searchingfor=passionata MIDI recording from Mutopia Project] (with LilyPond sheet music)</para></listitem>
-					<listitem><para><!--[http://www.musopen.com/music.php?type=piece&id=309--> Live recording from MusOpen]</para></listitem>
+					<listitem><para><citetitle>Mutopia</citetitle> at <ulink url="http://www.mutopiaproject.org/cgibin/make-table.cgi?searchingfor=appassionata" /> (MIDI synthesizer recording with LilyPond sheet music)</para></listitem>
+					<listitem><para><citetitle>MusOpen</citetitle> at <ulink url="http://www.musopen.com/music.php?type=piece&amp;id=309" /> (live recording)</para></listitem>
 					<listitem><para>The recording I used, played by Rudolf Serkin, available on the "Sony" label.</para></listitem>
 					</itemizedlist>
 					</para></listitem>
-				<listitem><para>[[User:Crantila/FSC/Synthesizers/<code>FluidSynth</code>|<code>FluidSynth</code>]]</para></listitem>
+				<listitem><para>You need to use FluidSynth, covered in <xref linkend="chap-Musicians_Guide-FluidSynth" />.</para></listitem>
 				</itemizedlist>
 			</para>
 		</section>
@@ -360,7 +346,7 @@
 			<para>
 				<orderedlist>
 				<listitem><para>Create a new audio track.</para></listitem>
-				<listitem><para>Right-click on the audio track, and go to ''Clip'' (a "Clip" in Qtractor is equivalent to a "Region" in Ardour) then ''Import''</para></listitem>
+				<listitem><para>Right-click on the audio track, and go to ''Clip'' then ''Import''.</para></listitem>
 				<listitem><para>Locate the audio file that you want to import (in this case, I imported a recording of the second movement of Beethoven's Op.57 piano sonata, "Appassionata").</para></listitem>
 				<listitem><para>If the clip doesn't start at the beginning of the track, then click and drag it to the beginning.</para></listitem>
 				</orderedlist>
diff --git a/en-US/Real_Time_and_Low_Latency.xml b/en-US/Real_Time_and_Low_Latency.xml
index c5ced01..12bf22c 100644
--- a/en-US/Real_Time_and_Low_Latency.xml
+++ b/en-US/Real_Time_and_Low_Latency.xml
@@ -62,12 +62,12 @@
 	<section id="sect-Musicians_Guide-Getting_Real_Time_Kernel_in_Fedora">
 		<title>Getting a Real-Time Kernel in Fedora Linux</title>
 		<para>
-			In Fedora Linux, the real-time kernel is provided by the Planet CCRMA at Home software repositories.  Along with the warnings in the [[User:Crantila/FSC/CCRMA/Everything|Planet CCRMA section]], here is one more to consider: the real-time kernel is used by fewer people than the standard kernel, so it is less well-tested.  The changes of something going wrong are relatively low, but be aware that using a real-time kernel increases the level of risk.  Always leave a non-real-time option available, in case the real-time kernel stops working.
+			In Fedora Linux, the real-time kernel is provided by the Planet CCRMA at Home software repositories.  Along with the warnings in the Planet CCRMA at Home chapter (see <xref linkend="sect-Musicians_Guide-CCRMA_Security_and_Stability" />), here is one more to consider: the real-time kernel is used by fewer people than the standard kernel, so it is less well-tested.  The chances of something going wrong are relatively low, but be aware that using a real-time kernel increases the level of risk.  Always leave a non-real-time option available, in case the real-time kernel stops working.
 		</para>
 		<para>
 			You can install the real-time kernel, along with other system optimizations, by following these instructions:
 			<orderedlist>
-			<listitem><para>Install the Planet CCRMA at Home repositories by following [[User:Crantila/FSC/CCRMA/Everything#Using Planet CCRMA at Home Software|these instructions]].</para></listitem>
+			<listitem><para>Install the Planet CCRMA at Home repositories by following the instructions in <xref linkend="sect-Musicians_Guide-CCRMA_Installing_Repository" />.</para></listitem>
 			<listitem><para>Run the following command in a terminal: [pre]su -c 'yum install planetccrma-core'[/pre]  Note that this is a meta-package, which does not install anything by itself, but causes a number of other packages to be installed, which will themselves perform the desired installation and optimization.</para></listitem>
 			<listitem><para>Shut down and reboot your computer, to test the new kernel.  If you decided to modify your GRUB configuration, be sure that you leave a non-real-time kernel available for use.</para></listitem>
 			</orderedlist>
diff --git a/en-US/Revision_History.xml b/en-US/Revision_History.xml
index 6923da0..6969f27 100644
--- a/en-US/Revision_History.xml
+++ b/en-US/Revision_History.xml
@@ -68,6 +68,21 @@
 					</simplelist>
 				</revdescription>
 			</revision>
+			
+			<revision>
+				<revnumber>4</revnumber>
+				<date>Thu Aug 5 2010</date>
+				<author>
+					<firstname>Christopher</firstname>
+					<surname>Antila</surname>
+					<email>crantila at fedoraproject.org</email>
+				</author>
+				<revdescription>
+					<simplelist>
+						<member></member>
+					</simplelist>
+				</revdescription>
+			</revision>
 		</revhistory>
 	</simpara>
 </appendix>
diff --git a/en-US/Rosegarden.xml b/en-US/Rosegarden.xml
index 42b153e..bf2cd2f 100644
--- a/en-US/Rosegarden.xml
+++ b/en-US/Rosegarden.xml
@@ -13,13 +13,13 @@
 		<section id="sect-Musicians_Guide-Rosegarden-Knowledge_Requirements">
 			<title>Knowledge Requirements</title>
 			<para>
-				Rosegarden's user interface is similar to other DAWs.  We recommend that you read !!L!!common interface!!L!! if you have not used a DAW before.
+				Rosegarden's user interface is similar to other DAWs.  We recommend that you read <xref linkend="sect-Musicians_Guide-DAW_User_Interface" /> if you have not used a DAW before.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Rosegarden-Software_Requirements">
 			<title>Software Requirements</title>
 			<para>
-				Rosegarden uses the JACK Audio Connection Kit.  You should install JACK before installing Rosegarden.  Follow the instructions !!L!! here !!L!! to install JACK.
+				Rosegarden uses the JACK Audio Connection Kit.  You should install JACK before installing Rosegarden.  See <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> for instructions to install JACK.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Rosegarden-Hardware_Requirements">
@@ -27,14 +27,11 @@
 			<para>
 				You need an audio interface to use Rosegarden.  If you will record audio with Rosegarden, you must have at least one microphone connected to your audio interface.  You do not need a microphone to record audio signals from other JACK-aware programs like <code>FluidSynth</code> and <code>SuperCollider</code>.
 			</para>
-			<para>
-				You need a MIDI synthesizer to use Rosegarden as a MIDI sequencer.  You can use hardware-based and software-based synthesizers with Rosegarden.  We recommend using the software-based <code>FluidSynth</code> MIDI synthesizer.  See !!L!! this section !!L!! for information about <code>FluidSynth</code>.
-			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Rosegarden-Other_Requirements">
 			<title>Other Requirements</title>
 			<para>
-				You need a MIDI synthesizer to use Rosegarden as a MIDI sequencer.  You can use hardware-based and software-based synthesizers with Rosegarden.  We recommend using the software-based <code>FluidSynth</code> MIDI synthesizer.  See !!L!! this section !!L!! for information about <code>FluidSynth</code>.
+				You need a MIDI synthesizer to use Rosegarden as a MIDI sequencer.  You can use hardware-based and software-based synthesizers with Rosegarden.  We recommend using the software-based <code>FluidSynth</code> MIDI synthesizer.  See <xref linkend="chap-Musicians_Guide-FluidSynth" /> for information about <code>FluidSynth</code>.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Rosegarden-Installation">
@@ -54,7 +51,7 @@
 				<orderedlist>
 				<listitem><para>Start QjackCtl to control JACK.</para></listitem>
 				<listitem><para>Start Qsynth to control FluidSynth.</para></listitem>
-				<listitem><para>In order to receive MIDI input from Rosegarden, Qsynth will need to be configured to use the "alsa_seq" MIDI Driver.  Instructions for doing this can be found in [[User:Crantila/FSC/Synthesizers/FluidSynth#MIDI_Input_Configuration|this section]].</para></listitem>
+				<listitem><para>In order to receive MIDI input from Rosegarden, Qsynth will need to be configured to use the "alsa_seq" MIDI Driver.  Instructions for doing this can be found in <xref linkend="sect-Musicians_Guide-FluidSynth-Configuring-MIDI_Input" />.</para></listitem>
 				<listitem><para>You may want to disconnect all JACK connections except for those that you want to use with Rosegarden.  Open QjackCtl's "Connect" window, and verify the following:
 					<itemizedlist>
 					<listitem><para>On the "Audio" tab:
@@ -89,7 +86,7 @@
 				</orderedlist>
 			</para>
 			<para>
-				If a connection isn't being used, it's better to leave it disconnected, to avoid making mistakes.
+				If a connection is not being used, it is better to leave it disconnected, to avoid making mistakes.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Rosegarden-Configuration-Rosegarden">
diff --git a/en-US/Solfege.xml b/en-US/Solfege.xml
index e4e0ff7..8a8aa90 100644
--- a/en-US/Solfege.xml
+++ b/en-US/Solfege.xml
@@ -16,7 +16,7 @@
 				It is assumed that, prior to using GNU Solfege, users have already correctly configured their audio equipment.
 			</para>
 			<para>
-				In addition, the <code>timidity++</code> package is required by Solfege, which requires the installation of a large (approximately 140&nbsp;MB) SoundFont library.  This library is shared with the <code>FluidSynth</code> application, which has its own section in this Guide, and is used by several other software packages.  <code>timidity++</code> also requires the installation of the JACK Audio Connection Kit.  If you have installed the Planet CCRMA at Home repository, and have not yet followed the instructions to correctly install and configure its version of JACK, then it is recommended that you do so before installing GNU Solfege.  Instructions can be found [[User:Crantila/FSC/Sound_Servers#Installing_JACK|here]].
+				In addition, the <code>timidity++</code> package is required by Solfege, which requires the installation of a large (approximately 140&nbsp;MB) SoundFont library.  This library is shared with the <code>FluidSynth</code> application, which has its own section in this Guide, and is used by several other software packages.  <code>timidity++</code> also requires the installation of the JACK Audio Connection Kit.  If you have installed the Planet CCRMA at Home repository, and have not yet followed the instructions to correctly install and configure its version of JACK, then it is recommended that you do so before installing GNU Solfege.  Refer to <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> for instructions to install JACK.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Solfege-Other_Requirements">
@@ -44,9 +44,9 @@
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Solfege-Install_MMA">
-			<title>Optionsl Installation: MMA</title>
+			<title>Optional Installation: MMA</title>
 			<para>
-				MMA stands for "Musical MIDI Accompaniment," and it is not available for Fedora in a pre-packaged format.  The software can be found on the internet from [http://www.mellowood.ca/mma/ here], where you can download the source code and compile it if desired.  MMA is only used by some of the harmonic dictation questions, so its installation is not required.
+				MMA stands for "Musical MIDI Accompaniment," and it is not available for Fedora in a pre-packaged format.  The software can be found on the <citetitle>MMA Homepage</citetitle> at <ulink url="http://www.mellowood.ca/mma/" />, where you can download the source code and compile it if desired.  MMA is only used by some of the harmonic dictation questions, so its installation is not required.
 			</para>
 		</section>
 	</section> <!-- Ends "Requirements and Installation" Section -->
@@ -123,7 +123,7 @@
 				Miscellaneous:
 				<itemizedlist>
 				<listitem><para>CSound: Solfege uses CSound for intonation exercises.  It is an optional component.  See the "Optional Installation" section above.</para></listitem>
-				<listitem><para>MMA: Solfege uses MMA for certain harmonic dictation exercises.  It is an optional component, and not available in Fedora through standard means.  See the project's [http://www.mellowood.ca/mma/ home page] for details.</para></listitem>
+				<listitem><para>MMA: Solfege uses MMA for certain harmonic dictation exercises.  It is an optional component, and not available in Fedora through standard means.  See <xref linkend="sect-Musicians_Guide-Solfege-Install_MMA" />.</para></listitem>
 				<listitem><para>Lilypond-book: Solfege uses this for generating print-outs of ear training exercise progress.  See the "Optional Installation" section above.</para></listitem>
 				<listitem><para>Latex: Solfege uses this for generating *.dvi format progress reports, rather than the default HTML format.</para></listitem>
@@ -211,7 +211,7 @@
 					<itemizedlist>
 					<listitem><para>Intonation, which tests your ability to perceive and identify whether a second pitch is lower or higher than it should be.</para></listitem>
 					<listitem><para>Dictation, which tests your ability to perceive and notate melodies.</para></listitem>
-					<listitem><para>Identify Tone, which tests your ability to use [http://en.wikipedia.org/wiki/Relative_pitch relative pitch].</para></listitem>
+					<listitem><para>Identify Tone, which tests your ability to use relative pitch (see <citetitle>Relative Pitch (Wikipedia)</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Relative_pitch" /> for more information).</para></listitem>
 					<listitem><para>Sing Twelve Random Tones, which tests many skills.</para></listitem>
 					<listitem><para>Beats per Minute, which tests your ability to determine a tempo.</para></listitem>
 					<listitem><para>Harmonic Progressions, which tests your ability to perceive and apply Roman numeral chord symbols for a series of chords played together.</para></listitem>
@@ -409,7 +409,7 @@
 		<section id="sect-Musicians_Guide-Solfege-Rhythm_Exercises">
 			<title>Rhythm</title>
 			<para>
-				This is dictation or play-back.  The rhythms described in this section use the "takadimi" rhythm system, which is explained [http://www.takadimi.net/takadimiArticle.html here].  You can use any rhythm system you prefer.
+				This is dictation or play-back.  The rhythms described in this section use the "takadimi" rhythm system, which is explained in <citetitle>The Takadimi Article</citetitle>, available at <ulink url="http://www.takadimi.net/takadimiArticle.html" />.  Use the rhythm system you prefer.
 			</para>
 			<para>
 				For Rhythmic Dictation:
@@ -492,7 +492,7 @@
 		<section id="sect-Musicians_Guide-Solfege-Intonation">
 			<title>Intonation</title>
 			<para>
-				In order to use the Intonation exercises, you must install the "Csound" application.  Instructions are located [[User:Crantila/FSC/Solfege#Optional Installation: Csound|here]].
+				In order to use the Intonation exercises, you must install the "Csound" application.  See <xref linkend="sect-Musicians_Guide-Solfege-Install_Csound" /> for instructions to install Csound.
 			</para>
 			<procedure>
 				<step><para>Click on "Intonation"</para></step>
diff --git a/en-US/Sound_Cards.xml b/en-US/Sound_Cards.xml
index 22199c5..1eb4b09 100644
--- a/en-US/Sound_Cards.xml
+++ b/en-US/Sound_Cards.xml
@@ -33,7 +33,7 @@
 				Musical Instrument Digital Interface (MIDI) is a standard used to control digital musical devices.  Many people associate the term with low-quality imitations of acoustic instruments.  This is unfortunate, because MIDI signals themselves do not have a sound.  MIDI signals are instructions to control devices: they tell a synthesizer when to start and stop a note, how long the note should be, and what pitch it should have.  The synthesizer follows these instructions and creates an audio signal.  Many MIDI-controlled synthesizers are low-quality imitations of acoustic instruments, but many are high-quality imitations.  MIDI-powered devices are used in many mainstream and non-mainstream musical situations, and can be nearly indistinguishable from actual acoustic instruments.  MIDI interfaces only transmit MIDI signals, not audio signals.  Some audio interfaces have built-in MIDI interfaces, allowing both interfaces to share the same physical device.
 			</para>
 			<para>
-				In order to create sound from MIDI signals, you need a "MIDI synthesizer."  Some MIDI synthesizers have dedicated hardware, and some use only software.  A software-only MIDI synthesizer, based on SoundFont technology, is discussed in the !!L!! FluidSynth Section !!L!! of the Musicians' Guide.
+				In order to create sound from MIDI signals, you need a "MIDI synthesizer."  Some MIDI synthesizers have dedicated hardware, and some use only software.  A software-only MIDI synthesizer, based on SoundFont technology, is discussed in <xref linkend="chap-Musicians_Guide-FluidSynth" />.
 			</para>
 			<para>
 				You can use MIDI signals, synthesizers, and applications without a hardware-based MIDI interface.  All of the MIDI-capable applications in the Musicians' Guide work well with software-based MIDI solutions, and are also compatible with hardware-based MIDI devices.
@@ -68,7 +68,7 @@
 				FireWire-connected sound cards are not as popular as USB-connected sound cards, but they are generally higher quality.  This is partly because FireWire-connected sound cards use FireWire's "guaranteed bandwidth" and "bus-mastering" capabilities, which both reduce latency.  High-speed FireWire connections are also available on older computers without a high-speed USB connection.
 			</para>
 			<para>
-				FireWire devices are sometimes incompatible with the standard Fedora Linux kernel.  If you have a FireWire-connected sound card, you should use the kernel from Planet CCRMA at Home.  Installation instructions for this kernel are available !!L!! here !!L!! .
+				FireWire devices are sometimes incompatible with the standard Fedora Linux kernel.  If you have a FireWire-connected sound card, you should use the kernel from Planet CCRMA at Home.  Refer to <xref linkend="sect-Musicians_Guide-Getting_Real_Time_Kernel_in_Fedora" /> for instructions to install the Planet CCRMA at Home kernel.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-USB_Sound_Cards">
@@ -90,7 +90,8 @@
 	<section id="sect-Musicians_Guide-Sample_Rate_and_Sample_Format">
 		<title>Sample, Sample Rate, Sample Format, and Bit Rate</title>
 		<para>
-			The primary function of audio interfaces is to convert signals between analog and digital formats.  As mentioned earlier, real sound has an infinite possibility of pitches, volumes, and durations.  Computers cannot process infinite information, so the audio signal must be converted before they can use it.  This diagram from Wikipedia illustrates the situation: [http://en.wikipedia.org/wiki/File:Pcm.svg here].  The red wave shape represents a sound wave that could be produced by a singer or an acoustic instrument.  The gradual change of the red wave cannot be processed by a computer, which must use an approximation, represented by the gray, shaded area of the diagram.  This diagram is an exaggerated example, and it does not represent a real recording.
+			!!P!! [https://fedoraproject.org/wiki/File:FMG-PCM_from_Wikipedia.svg]
+			The primary function of audio interfaces is to convert signals between analog and digital formats.  As mentioned earlier, real sound has an infinite range of pitches, volumes, and durations.  Computers cannot process infinite information, so the audio signal must be converted before they can use it.  The diagram above illustrates the situation.  The red wave shape represents a sound wave that could be produced by a singer or an acoustic instrument.  The gradual change of the red wave cannot be processed by a computer, which must use an approximation, represented by the gray, shaded area of the diagram.  This diagram is an exaggerated example, and it does not represent a real recording.
 		</para>
 		<para>
 			The conversion between analog and digital signals distinguishes low-quality and high-quality audio interfaces.  The sample rate and sample format control the amount of audio information that is stored by the computer.  The greater the amount of information stored, the better the audio interface can approximate the original signal from the microphone.  The possible sample rates and sample formats only partially determine the quality of the sound captured or produced by an audio interface.  For example, an audio interface integrated into a motherboard may be capable of a 24-bit sample format and 192&nbsp;kHz sample rate, but a professional-level, FireWire-connected audio interface capable of a 16-bit sample format and 44.1&nbsp;kHz sample rate may sound better.
diff --git a/en-US/Sound_Servers.xml b/en-US/Sound_Servers.xml
index b479f2f..f92a694 100644
--- a/en-US/Sound_Servers.xml
+++ b/en-US/Sound_Servers.xml
@@ -6,7 +6,7 @@
 
 <chapter id="chap-Musicians_Guide-How_Computers_Deal_with_Hardware">
 	<title>Software for Sound Cards</title>
-	<para>
+	<para> <!-- TODO: this description could use an extensive re-working, possibly with a graphical example -->
 		One of the techniques consistently used in computer science is abstraction.  Abstraction is the process of creating a generic model for something (or some things) that are actually unique.  The "driver" for a hardware device in a computer is one form of dealing with abstraction: the computer's software interacts with all sound cards in a similar way, and it is the driver which translates the universal instructions given by the software into specific instructions for operating that hardware device.  Consider this real-world comparison: you know how to operate doors because of abstracted instructions.  You don't know how to open and close every door that exists, but from the ones that you do know how to operate, your brain automatically creates abstracted instructions, like "turn the handle," and "push the door," which apply with all or most doors.  When you see a new door, you have certain expectations about how it works, based on the abstract behaviour of doors, and you quickly figure out how to operate that specific door with a simple visual inspection.  The principle is the same with computer hardware drivers: since the computer already knows how to operate "sound cards," it just needs a few simple instructions (the driver) in order to know how to operate any particular sound card.
 	</para>
 	
@@ -39,7 +39,7 @@
 		<section id="sect-Musicians_Guide-Sound_Servers-JACK">
 			<title>JACK Audio Connection Kit</title>
 			<para>
-				The JACK sound server offers fewer features than other sound servers, but they are tailor-made to allow the functionality required by audio creation applications.  JACK also makes it easier for users to configure the options that are most important for such situations.  The server supports only one sample rate and format at a time, and allows applications and hardware to easily [[User:Crantila/FSC/Sound_Cards#Routing_and_Multiplexing|connect and multiplex]] in ways that other sound servers do not.  It is also optimized to run with consistently low latencies.  Although using JACK requires a better understanding of the underlying hardware, the "QjackCtl" application provides a graphical user interface to ease the process.
+				The JACK sound server offers fewer features than other sound servers, but they are tailor-made to allow the functionality required by audio creation applications.  JACK also makes it easier for users to configure the options that are most important for such situations.  The server supports only one sample rate and format at a time, and allows applications and hardware to easily connect and multiplex in ways that other sound servers do not (see <xref linkend="sect-Musicians_Guide-Vocabulary-Routing_and_Multiplexing" /> for information about routing and multiplexing).  It is also optimized to run with consistently low latencies.  Although using JACK requires a better understanding of the underlying hardware, the "QjackCtl" application provides a graphical user interface to ease the process.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Sound_Servers-Phonon">
@@ -52,14 +52,12 @@
 		
 	<section id="sect-Musicians_Guide-Using_JACK">
 		<title>Using the JACK Audio Connection Kit</title>
-		<para>
-			!!I!! What to say here depends on whether jack2 will be available with Fedora 14.  If it is, no need for CCRMA solution.  If it isn't, need for CCRMA solution. !!I!!
-		</para>
+		<!-- TODO: Rewrite this section, knowing that jack2 will be in Fedora 14 -->
 		<section id="sect-Musicians_Guide-Install_and_Configure_JACK">
 			<title>Installing and Configuring JACK</title>
 			<para>
 				<orderedlist>
-				<listitem><para>Ensure that you have installed the Planet CCRMA at Home repositories.  For instructions, refer to [[User:Crantila/FSC/CCRMA/Everything#Installing_the_Repository|this section]].</para></listitem>
+				<listitem><para>Ensure that you have installed the Planet CCRMA at Home repositories.  For instructions, refer to <xref linkend="sect-Musicians_Guide-CCRMA_Installing_Repository" />.</para></listitem>
 				<listitem><para>Use PackageKit or KPackageKit to install the "jack-audio-connection-kit" and "qjackctl" packages, or run the following command in a terminal: [pre]su -c 'yum install jack-audio-connection-kit qjackctl'[/pre]</para></listitem>
 				<listitem><para>Review and approve the installation, making sure that it completes correctly.</para></listitem>
 				<listitem><para>Run QjackCtl from the KMenu or the Applications menu.</para></listitem>
@@ -72,7 +70,7 @@
 			<para>
 				JACK will operate without following this procedure, but users are strongly encouraged to follow these three steps, for security reasons.  They will help to allow optimal performance of the JACK sound server, while greatly reducing the risk that an application or user will accidentally or maliciously take advantage of the capability.
 				<orderedlist>
-				<listitem><para>Add all of the users who will use JACK to the "audio" group.  For help with this, see the !!L!!Deployment Guide, Chapter 22!!L!!</para></listitem>
+				<listitem><para>Add all of the users who will use JACK to the "audio" group.  For instructions to add users to groups, see Chapter 22, <citetitle>Users and Groups</citetitle> of the <citetitle>Fedora Deployment Guide</citetitle>, available at <ulink url="http://docs.fedoraproject.org" />.</para></listitem>
 				<listitem><para>The default installation automatically enables real-time priority to be requested by any user or process.  This is undesirable, so we will edit it.
 					<orderedlist>
 					<listitem><para>Open a terminal, and run the following command: [pre]su -c 'gedit /etc/security/limits.conf'[/pre]</para></listitem>
@@ -87,6 +85,7 @@
 			<para>
 				With the default configuration of QjackCtl, it chooses the "default" sound card, which actually goes through the ALSA sound server.  We can avoid this, and use the ALSA drivers without the sound server, which will help JACK to maintain accurately low latencies.  The following procedure configures JACK to connect to the ALSA driver directly.
 			</para>
+			<!-- Part of this procedure appears in the Audacity chapter.  Keep them in sync. -->
 			<procedure>
 				<step><para>Open a terminal.  In GNOME, click on 'Applications > System > Terminal'.  In KDE, click on the application launcher, then 'System > Konsole'.</para></step>
 				<step><para>Execute this command: "cat /proc/asound/cards".</para></step>
@@ -111,7 +110,7 @@
 				The QjackCtl application offers many more features and configuration options.  The patch bay is a notable feature, which lets users save configurations of the "Connections" window, and restore them later, to help avoid the lengthy set-up time that might be required in complicated routing and multiplexing situations.
 			</para>
 			<para>
-				For more information on QjackCtl, you can refer to [http://www.64studio.com/manual/audio/jack this] web site.
+				For more information on QjackCtl, refer to <citetitle>Jack Audio Connection Kit (64studio)</citetitle> at <ulink url="http://www.64studio.com/manual/audio/jack" />.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-Integrating_PulseAudio_with_JACK">
diff --git a/en-US/SuperCollider/SuperCollider-Basic_Programming.xml b/en-US/SuperCollider/SuperCollider-Basic_Programming.xml
index a2525d5..3bbb2a4 100644
--- a/en-US/SuperCollider/SuperCollider-Basic_Programming.xml
+++ b/en-US/SuperCollider/SuperCollider-Basic_Programming.xml
@@ -479,8 +479,10 @@
 			</para>
 			<para>
 				The following example is an extension of the "Imperative" example.  Pretend that the following Functions exist and that they perform the following tasks:
-				* getinput : allows the user to enter a number, and returns that number
-				* add : adds together the numbers given as arguments, returning the sum
+				<itemizedlist>
+				<listitem><para>getinput : allows the user to enter a number, and returns that number</para></listitem>
+				<listitem><para>add : adds together the numbers given as arguments, returning the sum</para></listitem>
+				</itemizedlist>
 				[pre]
 				(
 				   postln( add( getinput, getinput ) );
@@ -518,12 +520,12 @@
 				If this kind of programming is new to you, it might seem extremely difficult.  It can be intimidating at first, but it is actually not too difficult to understand once you start to use it.  In fact, you have already been using it!  Remember the <code>postln</code> command that was described earlier as a special kind of Function?  It's actually a Function defined by SuperCollider's abstract class <code>Object</code>, which defines a set of messages that can be passed to ''any'' SuperCollider Object.  Because most things in SuperCollider are Objects, we can send them the <code>postln</code> message, and they will understand that it means to print themselves in the "SuperCollider output" pane.
 			</para>
 			<para>
-				<!-- Begin Confusing??? -->
+				<!-- Begin Confusing??? (TODO) -->
 				Why is it that all Objects respond to the <code>postln</code> message?  SuperCollider classes are allowed to belong to other SuperCollider classes, of which they are a part.  Consider the Bicycle class again.  It is a kind of vehicle, and philosophers might say that "things that are members of the bicycle class are also members of the vehicle class."  That is, real-world bicycles share certain characteristics with other real-world objects that are classified as "vehicles."  The bicycle class is a "sub-class" of the vehicle class, and it '''inherits''' certain properties from the vehicles class.  SuperCollider allows this behaviour too, and calls it '''inheritance'''.  In SuperCollider, since all classes define Objects, they are all automatically considered to be a sub-class of the class called <code>Object</code>.  All classes therefore inherit certain characteristics from the <code>Object</code> class, like knowing how to respond to the <code>postln</code> message.
 				<!-- End Confusing??? -->
 			</para>
 			<para>
-				equivalent notation: 5.postln versus postln( 5 )
+				The following notations are equivalent: <literal>5.postln</literal> and <literal>postln( 5 )</literal>.
 			</para>
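+			<para>
+				For example, both of the following lines print the number 5 to the "SuperCollider output" pane; only the notation differs (a minimal illustration of the equivalence):
+				[pre]
+				5.postln;       // "receiver" notation: the postln message is sent to the Object 5
+				postln( 5 );    // "functional" notation: 5 is passed as an argument
+				[/pre]
+			</para>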
 			<para>
 				You still don't know how to write new Classes and Objects in SuperCollider, but knowing how to use them is more than enough for now.  By the time you need to write your own Classes, you will probably prefer to use the official SuperCollider help files, anyway.
@@ -625,7 +627,7 @@
 				   play( sound );
 				)
 				[/pre]
-				WHY DOESN'T THAT WORK?!?!?!?!?!?
+				<!-- WHY DOESN'T THAT WORK?!?!?!?!?!? (TODO) -->
 			</para>
 		</section>
 	</section> <!-- Ends "Sound-Making Functions" Section --> <!--    SC-Basic_Programming-Sound_Making_Functions-    -->
@@ -1200,7 +1202,7 @@
 				[pre]
 				testFunc.while( bodyFunc );
 				[/pre]
-				The test condition, called <code>testFunc</code>, is a Function which returns a boolean value - either "true" or "false".  The loop's body, called <code>bodyFunc</code>, is a Function which can do anything.  The loop body function is not provided any arguments by the interpreter.  You will have to use comparison operators and boolean expressions when writing the Function for the test condition.  For information on how these work in SuperCollider, see !!L!! THIS SECTION HERE !!L!! (#Boolean_Operators and #Boolean_Expressions)?
+				The test condition, called <code>testFunc</code>, is a Function which returns a boolean value - either "true" or "false".  The loop's body, called <code>bodyFunc</code>, is a Function which can do anything.  The loop body function is not provided any arguments by the interpreter.  You will have to use comparison operators and boolean expressions when writing the Function for the test condition.  For information on how these work in SuperCollider, see <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Conditional_Execution-Boolean_Operators" /> and <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Conditional_Execution-Boolean_Expressions" />.
 			</para>
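+			<para>
+				As a minimal sketch of this syntax, the following loop prints the numbers 0 through 4; the variable <code>i</code> exists only for this example:
+				[pre]
+				(
+				   var i = 0;
+				   { i < 5 }.while( { i.postln; i = i + 1; } );   // test Function first, then the loop body
+				)
+				[/pre]
+			</para>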
 			<para>
 				The following three code blocks are equivalent:
@@ -1414,7 +1416,7 @@
 			<section id="sect-Musicians_Guide-SC-Basic_Programming-Conditional_Execution-Order_of_Precedence">
 				<title>Order of Precedence</title>
 				<para>
-					In complicated boolean expressions, it's important to clarify the order in which you want sub-expressions to be executed.  This order is called the "order of precedence," or [http://en.wikipedia.org/wiki/Order_of_operations "order of operations."]  In computer science, different programming languages enforce different orders of precedence, so you should use parentheses to clarify your intended order, to proactively avoid later confusion.  The interpreter will evaluate an expression from left to right, and always fully evaluate parentheses before continuing.
+					In complicated boolean expressions, it's important to clarify the order in which you want sub-expressions to be executed.  This order is called the "order of precedence," or "order of operations" (see <citetitle>Order of Operations (Wikipedia)</citetitle>, available at <ulink url="http://en.wikipedia.org/wiki/Order_of_operations" /> for more information).  In computer science, different programming languages enforce different orders of precedence, so you should use parentheses to clarify your intended order, to proactively avoid later confusion.  The interpreter will evaluate an expression from left to right, and always fully evaluate parentheses before continuing.
 				</para>
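+				<para>
+					As a small sketch of this behaviour, the interpreter evaluates binary operators strictly from left to right, with no arithmetic precedence, so parentheses are the only reliable way to control the order:
+					[pre]
+					( 5 + 2 * 3 ).postln;       // evaluated left to right: (5 + 2) * 3, which is 21
+					( 5 + ( 2 * 3 ) ).postln;   // inner parentheses first: 5 + 6, which is 11
+					( ( 5 + 2 ) * 3 ).postln;   // same result as the first line, but the intent is clear: 21
+					[/pre]
+				</para>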
 				<para>
 					Even simple expressions can benefit from parentheses.  These produce the same results:
@@ -1816,7 +1818,7 @@
 		<section id="sect-Musicians_Guide-SC-Basic_Programming-SynthDef_and_Synth-Out_UGen">
 			<title>"Out" UGen</title>
 			<para>
-				The "Out" UGen is one of the bits of magic automatically taken care of by the interpreter.  It routes an audio signal from another UGen into a specific output (actually, into a specific bus - explained in the !!L!! BUSSES !!L!! section).
+				The "Out" UGen is one of the bits of magic automatically taken care of by the interpreter.  It routes an audio signal from another UGen into a specific output (actually, into a specific bus - see <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Busses" />).
 			</para>
 			<para>
 				The following examples have the same effect:
@@ -1827,7 +1829,7 @@
 				[pre]
 				{ Out.ar( 0, SinOsc.ar( freq:500, mul:0.2 ) ); }.play;
 				[/pre]
-				The first argument to "Out.ar" is the bus number for where you want to place the second argument, which is either a UGen or a multi-channel Array of UGen's.  If the second argument is an Array, then the first element is sent to the first argument's bus number, the second argument is sent to one bus number higher, the third to two bus numbers higher, and so on.  This issues is explained fully in the !!L!! BUSSES !!L!! section, but here's what you need to know for now, working with stereo (two-channel) audio:
+				The first argument to "Out.ar" is the bus number for where you want to place the second argument, which is either a UGen or a multi-channel Array of UGen's.  If the second argument is an Array, then the first element is sent to the first argument's bus number, the second element is sent to one bus number higher, the third to two bus numbers higher, and so on.  This issue is explained fully in <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Busses" />, but here is what you need to know for now, working with stereo (two-channel) audio (a short example follows the list):
 				<itemizedlist>
 				<listitem><para>If the second argument is a two-element Array, use bus number 0.</para></listitem>
 				<listitem><para>If the second argument is a single UGen, and you want it to be heard through the left channel, use bus number 0.</para></listitem>
@@ -1835,7 +1837,7 @@
 				</itemizedlist>
 			</para>
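+			<para>
+				Here is a minimal sketch of the two-channel case: a two-element Array sent to bus number 0, so that the left and right channels receive slightly different sine waves (the frequencies are arbitrary):
+				[pre]
+				{ Out.ar( 0, [ SinOsc.ar( freq:440, mul:0.1 ), SinOsc.ar( freq:443, mul:0.1 ) ] ); }.play;
+				[/pre]
+			</para>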
 			<para>
-				If you're still struggling with exactly what the "Out" UGen does, think of it like this: when you create an audio-rate UGen, it starts creating an audio signal; the "Out" UGen effectively connects the audio-rate UGen into your audio interface's output port, so it can be heard through the speakers.  In the !!L! BUSSES !!L!! section, it becomes clear that there are, in fact, other useful places to connect an audio-rate UGen (through an effect processor, for example), and the "Out" UGen can help you do that.
+				If you're still struggling with exactly what the "Out" UGen does, think of it like this: when you create an audio-rate UGen, it starts creating an audio signal; the "Out" UGen effectively connects the audio-rate UGen into your audio interface's output port, so it can be heard through the speakers.  In <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Busses" />, it becomes clear that there are, in fact, other useful places to connect an audio-rate UGen (through an effect processor, for example), and the "Out" UGen can help you do that.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-SC-Basic_Programming-SynthDef_and_Synth-SynthDef">
@@ -1865,7 +1867,7 @@
 				[pre]
 				SynthDef.new( nameOfSynthDef, FunctionContainingOutUGen ).send( nameOfServer );
 				[/pre]
-				The <code>FunctionContainingOutUGen</code> is simply that - a Function that, when executed, returns an "Out" UGen (meaning that the "Out" UGen must be the last expression in the Function).  The <code>nameOfSynthDef</code> should be a symbol (described !!L!!BELOW!!L!!), but can also be a string.  The <code>nameOfServer</code> is a variable that represents the server to which you want to send the SynthDef's information; unless you know that you need to use a different variable for this, it's probably just the letter "s", which the interpreter automatically assigns to the default server.
+				The <code>FunctionContainingOutUGen</code> is simply that - a Function that, when executed, returns an "Out" UGen (meaning that the "Out" UGen must be the last expression in the Function).  The <code>nameOfSynthDef</code> should be a symbol (as described in <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-SynthDef_and_Synth-Symbols" />), but can also be a string.  The <code>nameOfServer</code> is a variable that represents the server to which you want to send the SynthDef's information; unless you know that you need to use a different variable for this, it's probably just the letter "s", which the interpreter automatically assigns to the default server.
 			</para>
 			<para>
 				Here is a demonstration of both methods:
@@ -2223,7 +2225,7 @@
 		<section id="sect-Musicians_Guide-SC-Basic_Programming-Busses-Out_and_In_UGens">
 			<title>Out and In UGens</title>
 			<para>
-				The "Out" UGen is discussed in the !!L!!SynthDef section!!L!!.  What it does is take a signal and route it to the specified bus number.  The "In" UGen performs a similar action: take a signal from a bus number, and make it available for use.
+				The "Out" UGen is discussed in <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-SynthDef_and_Synth-Out_UGen" />.  What it does is take a signal and route it to the specified bus number.  The "In" UGen performs a similar action: take a signal from a bus number, and make it available for use.
 			</para>
 			<para>
 				This is the syntax to use for "Out":
@@ -2342,7 +2344,7 @@
 				<listitem><para>SynthDef: These commands are straight-forward.  They send the synthesis definitions to the server.</para></listitem>
 				<listitem><para><code>b = Bus.control( s );</code> : This should also be straight-forward.  A single-channel control bus is created, and assigned to the pre-declared variable "b".</para></listitem>
 				<listitem><para>For synth creation, x is assigned a control-rate synth, while y and z are assigned audio-rate synths.  Each synth is given the variable "b", which refers to our control-rate bus.  "z" is also given an argument for \freqOffset, which makes its frequency 200 Hz higher than the synth assigned to "y".</para></listitem>
-				<listitem><para>Don't worry about the "after" message for now.  It's explained in !!L!!the section about Ordering!!L!!</para></listitem>
+				<listitem><para>Don't worry about the "after" message for now.  It's explained in <xref linkend="sect-Musicians_Guide-C-Basic_Programming-Ordering_and_Other_Features-Ordering" />.</para></listitem>
 				</itemizedlist>
 			</para>
 			<section id="sect-Musicians_Guide-SC-Basic_Programming-Busses-Control_Rate_Example-Global_Variables">
@@ -2357,11 +2359,13 @@
 					The control-rate bus in this example might seem trivial and pointless to you, especially since the use of a UGen to control frequency has already been illustrated in other examples.  For this particular program, a control-rate UGen would probably have been a better choice, but remember that this is just an example.
 				</para>
 				<para>
-					Here are some main advantages to using a control-rate Bus over a UGen:
-					# The signal can be changed without sending the "set" message to the audio-rate UGen, simply by changing the input to the bus.
-					# Input to the bus can be produced by any number of control-rate UGen's.
-					# The signal in the bus can be received by more than one UGen, as it is in this example.  One thousand audio-rate UGen's powered by 25 control-rate UGen's is a much better solution than if each audio-rate UGen were powered by its own control-rate UGen.
-					# Busses can be accessed quickly and efficiently from any place in the program that has access to the variable holding the Bus.  It's easier and safer (less error-prone) than making all of your UGen's equally accessible.
+					Here are some advantages to using a control-rate Bus over a UGen:
+					<itemizedlist>
+					<listitem><para>The signal can be changed without sending the "set" message to the audio-rate UGen, simply by changing the input to the bus (see the sketch after this list).</para></listitem>
+					<listitem><para>Input to the bus can be produced by any number of control-rate UGen's.</para></listitem>
+					<listitem><para>The signal in the bus can be received by more than one UGen, as it is in this example.  One thousand audio-rate UGen's powered by 25 control-rate UGen's is a much better solution than if each audio-rate UGen were powered by its own control-rate UGen.</para></listitem>
+					<listitem><para>Busses can be accessed quickly and efficiently from any place in the program that has access to the variable holding the Bus.  It's easier and safer (less error-prone) than making all of your UGen's equally accessible.</para></listitem>
+					</itemizedlist>
 				</para>
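+				<para>
+					The following sketch shows the first advantage in isolation: a control-rate Bus feeds the frequency of a synth, so the sound changes whenever a new value is written to the bus.  The variable names and frequencies are arbitrary:
+					[pre]
+					(
+					   b = Bus.control( s, 1 );   // a one-channel control bus on the default server
+					   b.set( 440 );              // write an initial value to the bus
+					   x = { SinOsc.ar( In.kr( b.index, 1 ), mul:0.1 ) }.play;
+					)
+					b.set( 660 );   // execute later: the frequency changes without a "set" message to the synth
+					[/pre]
+				</para>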
 				<para>
 					Some of these advantages could be seen as disadvantages.  Whether you should use a Bus or a UGen depends on the particular application.  The simpler solution is usually the better one, as long as you remember to avoid repetition!
@@ -2495,7 +2499,7 @@
 		<section id="sect-Musicians_Guide-C-Basic_Programming-Ordering_and_Other_Features-Ordering">
 			<title>Ordering</title>
 			<para>
-				Ordering is instructing the server to calculate in a particular order.  The audio synthesized by the server takes the same form as any other digital audio: a series of samples are played at a particular speed (called !!L!!sample rate!!L!!), each with a set number of bits per sample (called !!L!!sample format!!L!!).  For each sample, the server calculates the signal at that point in a pre-determined order.  Each sample is calculated from scratch, so if a particular UGen depends on the output of another UGen, the other one had better be calculated first.
+				Ordering is instructing the server to calculate in a particular order.  The audio synthesized by the server takes the same form as any other digital audio: a series of samples are played at a particular speed (called sample rate), each with a set number of bits per sample (called sample format).  For each sample, the server calculates the signal at that point in a pre-determined order.  Each sample is calculated from scratch, so if a particular UGen depends on the output of another UGen, the other one had better be calculated first.  For more information on samples, sample rate, and sample format, see <xref linkend="sect-Musicians_Guide-Sample_Rate_and_Sample_Format" />.
 			</para>
 			<para>
 				Consider the following example:
@@ -2515,7 +2519,7 @@
 				[/pre]
 				And it works just as it looks, too: the server creates a new synth, adds it before or after the synth represented by "variableHoldingSynth" (depending on which Function you use), and uses "nameOfSynthDef" and "ListOfArguments" just as in the "add" method.
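+				For example, with two hypothetical SynthDef names, <code>\myControl</code> and <code>\mySound</code>, already sent to the server:
+				[pre]
+				x = Synth.new( \myControl );      // \myControl and \mySound are hypothetical SynthDefs
+				y = Synth.after( x, \mySound );   // "y" is always calculated after "x"
+				[/pre]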
 				
-				This example, from the !!L!!"Bus"!!L!! section, uses the "after" Function to ensure that the control-rate synth is calculated before the audio-rate synths that depend on it.
+				This example, from <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Busses-Control_Rate_Bus_Example" />, uses the "after" Function to ensure that the control-rate synth is calculated before the audio-rate synths that depend on it.
 				[pre]
 				( // execute first: prepare the server
 				   var busAudioSynth = 
@@ -2554,7 +2558,7 @@
 				In this case, the control-rate synth is created before the audio-rate synths - probably the easier way to think about it.  Even so, it's possible to add them in the opposite order with a little extra thought.
 			</para>
 			<para>
-				The other example from the !!L!!"Bus"!!L!! section used the "before" Function to ensure that the "pink noise" and "sine wave" UGen's were calculated before the "reverberation" UGen.  Especially since these are all audio-rate UGen's, the server would not reasonably know which to calculate first, so you need to let it know.
+				The other example from <xref linkend="sect-Musicians_Guide-SC-Basic_Programming-Busses" /> uses the "before" Function to ensure that the "pink noise" and "sine wave" UGen's were calculated before the "reverberation" UGen.  Especially since these are all audio-rate UGen's, the server would not reasonably know which to calculate first, so you need to let it know.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-C-Basic_Programming-Ordering_and_Other_Features-Changing_the_Order">
@@ -2779,13 +2783,13 @@
 		<section id="sect-Musicians_Guide-SC-Basic_Programming-Getting_Help-Email">
 			<title>Email</title>
 			<para>
-				If you feel comfortable sending an email to a mailing list, you can use the [http://www.beast.bham.ac.uk/research/sc_mailing_lists.shtml sc-users] list.  If you decide to subscribe to this list, be aware that it receives a large amount of mail every day.
+				If you feel comfortable sending an email to a mailing list, you can use the <citetitle>sc-users Mailing List</citetitle>, available at <ulink url="http://www.beast.bham.ac.uk/research/sc_mailing_lists.shtml" />.  If you decide to subscribe to this list, be aware that it receives a large amount of mail every day.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-SC-Basic_Programming-Getting_Help-SuperCollider_Website">
 			<title>The SuperCollider Website</title>
 			<para>
-				The SuperCollider web site at SourceForge offers links to many resources.  It is available [http://supercollider.sourceforge.net/ here].
+				The <citetitle>SuperCollider Website</citetitle> at SourceForge (<ulink url="http://supercollider.sourceforge.net/" />) offers links to many resources.
 			</para>
 		</section>
 	</section> <!-- Ends "How to Get Help" Section --> <!--    SC-Basic_Programming-Getting_Help-    -->
@@ -2793,10 +2797,10 @@
 	<section id="sect-Musicians_Guide-SC-Basic_Programming-Legal_Attribution">
 		<title>Legal Attribution</title>
 		<para>
-			This portion of the Fedora Musicians' Guide, called "Basic Programming with SuperCollider," is a derivative work of the, "Getting Started With SuperCollider" tutorial.  The original work was created by Scott Wilson, James Harkins, and the SuperCollider development team.  It is available on the internet at [http://supercollider.svn.sourceforge.net/viewvc/supercollider/trunk/common/build/Help/Tutorials/Getting-Started/Getting%20Started%20With%20SC.html this location].
+			This portion of the Fedora Musicians' Guide, called "Basic Programming with SuperCollider," is a derivative work of the <citetitle>Getting Started With SuperCollider</citetitle> tutorial.  The original work was created by Scott Wilson, James Harkins, and the SuperCollider development team.  It is available on the internet from <ulink url="http://supercollider.svn.sourceforge.net/viewvc/supercollider/trunk/common/build/Help/Tutorials/Getting-Started/Getting%20Started%20With%20SC.html" />.
 		</para>
 		<para>
-			The original document, like all SuperCollider documentation, is licenced under the Creative Commons' [http://creativecommons.org/licenses/by-sa/3.0/ Attribution Share-Alike 3.0 Unported] licence.
+			The original document, like all SuperCollider documentation, is licenced under the Creative Commons' <citetitle>Attribution Share-Alike 3.0 Unported Licence</citetitle>, accessible on the internet at <ulink url="http://creativecommons.org/licenses/by-sa/3.0/" />.
 		</para>
 		<para>
 			This usage should in no way be construed as an endorsement of the Fedora Project, the Musicians' Guide, or any other party by the SuperCollider development team.
diff --git a/en-US/SuperCollider/SuperCollider-Composing.xml b/en-US/SuperCollider/SuperCollider-Composing.xml
index 193d9f7..d1bd1c6 100644
--- a/en-US/SuperCollider/SuperCollider-Composing.xml
+++ b/en-US/SuperCollider/SuperCollider-Composing.xml
@@ -7,7 +7,7 @@
 <section id="sect-Musicians_Guide-SuperCollider-Composing">
 	<title>Composing with SuperCollider</title>
 	<para>
-		This section is an explanation of the creative thought-process that went into creating the SuperCollider composition that we've called "Method One," for which the source and exported audio files are available below in the !!L!!"Included Files"!!L!! section.
+		This section is an explanation of the creative thought-process that went into creating the SuperCollider composition that we've called "Method One," for which the source and exported audio files are available below.
 	</para>
 	<para>
 		It is our hope that, in illustrating how we developed this composition from a single SinOsc command, you will learn about SuperCollider and its abilities, about how to be creative with SuperCollider, and how a simple idea can turn into something of greater and greater complexity.
@@ -21,19 +21,19 @@
 		<para>
 			The following files represent complete versions of the program.  You should try to complete the program yourself before reviewing these versions:
 			<itemizedlist>
-			<listitem><para>[[User:Crantila/FSC/Synthesizers/SuperCollider/FSC_method_1.sc|Method One]]</para></listitem>
-			<listitem><para>[[User:Crantila/FSC/Synthesizers/SuperCollider/FSC_method_1-short.sc|Method One (Short)]]</para></listitem>
-			<listitem><para>[[Media:FMG-Method_One.flac|FLAC Recording of "Method One"]]</para></listitem>
+			<listitem><para>Original Version with Full Explanations: <ulink url="https://fedoraproject.org/wiki/User:Crantila/FSC/Synthesizers/SuperCollider/FSC_method_1.sc" /></para></listitem>
+			<listitem><para>Optimized Version with Fewer Explanations: <ulink url="https://fedoraproject.org/wiki/User:Crantila/FSC/Synthesizers/SuperCollider/FSC_method_1-short.sc" /></para></listitem>
+			<listitem><para>FLAC Recording: <ulink url="https://fedoraproject.org/wiki/Media:FMG-Method_One.flac" /></para></listitem>
 			</itemizedlist>
 		</para>
 		<para>
-			FSC_method_1.sc : This is an extensively-commented version of the source code.  The comments not only describe the way the code works, but pose some problems and questions that you may wish to work on, to increase your knowledge of SuperCollider.  The problem with the verbosity of the comments is that it can be difficult to read the code itself, as it would be written in a real program.
+			<literal>FSC_method_1.sc</literal>: This is an extensively-commented version of the source code.  The comments not only describe how the code works, but also pose some problems and questions that you may wish to work on to increase your knowledge of SuperCollider.  The drawback of such verbose comments is that they can make it difficult to read the code itself as it would be written in a real program.
 		</para>
 		<para>
-			FSC_method_1-short.sc : This is a less-commented version of the source code.  I've also re-written part of the code, to make it more flexible for use in other programs.  The differences between this, and code that I would have written for myself only, are trivial.
+			<literal>FSC_method_1-short.sc</literal>: This is a less-commented version of the source code.  I have also rewritten part of the code to make it more flexible for use in other programs.  The differences between this version and code that I would have written only for myself are trivial.
 		</para>
 		<para>
-			FSC_method_1.flac : This is a recording that I produced of the program, which I produced in Ardour.
+			<literal>FSC_method_1.flac</literal>: This is a recording of the program, which I produced in Ardour.
 		</para>
 	</section>
 	
diff --git a/en-US/SuperCollider/SuperCollider-Exporting.xml b/en-US/SuperCollider/SuperCollider-Exporting.xml
index 88a6327..868783d 100644
--- a/en-US/SuperCollider/SuperCollider-Exporting.xml
+++ b/en-US/SuperCollider/SuperCollider-Exporting.xml
@@ -13,26 +13,26 @@
 	<section id="sect-Musicians_Guide-SC-Non_Real_Time_Synthesis">
 		<title>Non-Real-Time Synthesis</title>
 		<para>
-			SuperCollider allows you to synthesze audio output to an audio file.  Doing this requires using OSC commands on the server, the "DiskOut" UGen, the "Buffer" UGen, and other relatively advanced concepts.  The built-in help file located [file:///usr/share/SuperCollider/Help/UGens/Playback%20and%20Recording/DiskOut.html here] contains some help with the DiskOut UGen, and links to other useful help files.  This method is not further discussed here.
+			SuperCollider allows you to synthesize audio output directly to an audio file.  Doing this requires using OSC commands on the server, the "DiskOut" UGen, the "Buffer" UGen, and other relatively advanced concepts.  The built-in <citetitle>DiskOut</citetitle> help file, available from <ulink url="file:///usr/share/SuperCollider/Help/UGens/Playback%20and%20Recording/DiskOut.html" /> on Fedora Linux systems, contains some help with the DiskOut UGen, and links to other useful help files.  This method is not discussed further here.
 		</para>
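		<para>
			As a rough illustration only, one common pattern for using DiskOut looks something like the following sketch.  It is not taken from this Guide or copied from the help file; the synth name, buffer size, and output path are assumptions, so adjust them to suit your system.
		</para>
<programlisting>
// Minimal sketch of recording server output to a file with DiskOut.
// Evaluate these lines one at a time; names and paths are placeholders.
s.boot;

// A synth that copies the first two output channels into a buffer.
SynthDef("fmg-disk-writer", { |bufnum|
	DiskOut.ar(bufnum, In.ar(0, 2));
}).send(s);

b = Buffer.alloc(s, 65536, 2);                                // streaming buffer
b.write("/tmp/sc-output.aiff", "aiff", "int16", 0, 0, true);  // leaveOpen: true
x = Synth.tail(nil, "fmg-disk-writer", [\bufnum, b.bufnum]);

// ...play the material you want to capture, then clean up:
x.free;
b.close;
b.free;
</programlisting>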
 	</section>
 	
 	<section id="sect-Musicians_Guide-SC-Recording_SuperColliders_Output">
 		<title>Recording SuperCollider's Output (Tutorial)</title>
 		<para>
-			Since SuperCollider outputs its audio signals to the JACK sound server, any other JACK-aware program has the opportunity to record, process, and use them.  This portion of the tutorial will help you to record SuperCollider's output in Ardour.  Due to the advanced nature of SuperCollider, the text assumes that you have a basic knowledge of how to work with Ardour.  If not, you may find it helpful to refer to the [[User:Crantila/FSC/Recording/Ardour|Ardour chapter]] of the Musicians' Guide.
+			Since SuperCollider outputs its audio signals to the JACK sound server, any other JACK-aware program has the opportunity to record, process, and use them.  This portion of the tutorial will help you to record SuperCollider's output in Ardour.  Due to the advanced nature of SuperCollider, the text assumes that you have a basic knowledge of how to work with Ardour.  If not, you may find it helpful to refer to <xref linkend="chap-Musicians_Guide-Ardour" />.
 		</para>
 		<para>
 			This procedure will help you to use Ardour to record SuperCollider's output.
 		</para>
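		<para>
			If you do not yet have a program of your own to record, a simple test signal is enough to confirm that the recording chain works while you follow the procedure below.  The following sketch is not part of the tutorial itself; it simply plays a quiet pair of slowly-beating sine tones on the first two output channels.
		</para>
<programlisting>
// An assumed test signal for checking the Ardour recording chain.
(
{
	var left, right;
	left  = SinOsc.ar(440, 0, 0.1);   // left channel
	right = SinOsc.ar(443, 0, 0.1);   // right channel, beats against the left
	[left, right]
}.play;
)
</programlisting>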
 		<procedure>
-			<step><para>Close unnecessary applications and stop unnecessary processes, which will help to reduce the risk of a buffer underrun, which would cause an audible break in audio.  If you are viewing this document in a web browser, you may want to copy-and-paste it into a simple text editor, or GEdit, if you are already using that.</para></step>
+			<step><para>Close unnecessary applications and stop unnecessary processes, which will help to reduce the risk of a buffer overrun or underrun, which would cause an audible break in the audio.  If you are viewing this document in a web browser, you may want to copy and paste it into a simple text editor, or into GEdit if you are already using it.</para></step>
 			<step><para>Use QjackCtl to set up JACK with the right audio interface and configuration options.</para></step>
 			<step><para>In order to get a clean start, restart the SuperCollider interpreter in GEdit, then start the server.</para></step>
 			<step><para>Open Ardour with a new session, and set up the rulers and timeline as desired.  Seconds is usually the most appropriate unit with which to measure a SuperCollider recording.</para></step>
 			<step><para>Add a stereo track (or as many channels as desired), and rename it "SuperCollider."</para></step>
 			<step><para>Use Ardour (the "Track/Bus Inspector" window) or QjackCtl to connect the "SuperCollider" track to SuperCollider's outputs.</para></step>
-			<step><para>You'll want to make sure that the SuperCollider output is also connected to your audio interface, so that you can hear the program as you progress.  This is an example of !!L!!multi-plexing!!L!!.   Changes to your audio interface's volume control will not affect the recording in Ardour.</para></step>
+			<step><para>You'll want to make sure that the SuperCollider output is also connected to your audio interface, so that you can hear the program as you progress.  This is an example of multiplexing.  Changes to your audio interface's volume control will not affect the recording in Ardour.</para></step>
 			<step><para>Arm the track and transport in Ardour.  When you are ready, start the transport.  It is not important to start SuperCollider as quickly as possible, since you can cut out the silence after the recording is made.</para></step>
 			<step><para>Switch to GEdit and play the program that you want to record.  If you make a mistake while starting the program, that's okay.  We can always edit the recording after it's recorded.</para></step>
 			<step><para>Listen to the recording as it goes along.  Use QjackCtl to make sure that you don't encounter a buffer underrun, and Ardour to make sure that you don't record a distorted signal.</para></step>
diff --git a/en-US/SuperCollider/SuperCollider.xml b/en-US/SuperCollider/SuperCollider.xml
index 016dce7..0eb9905 100644
--- a/en-US/SuperCollider/SuperCollider.xml
+++ b/en-US/SuperCollider/SuperCollider.xml
@@ -35,15 +35,15 @@
 				<!-- Rarely is such a warning required, but I feel it is necessary for SuperCollider -->
 				SuperCollider is by far the most difficult program described in the Fedora Musicians' Guide.  The SuperCollider applications themselves are easy to use, and they work very well, but they are merely tools to help you accomplish something useful.  SuperCollider has an extremely powerful and flexible programming language, with libraries designed primarily for audio processing.  As often happens with computers, however, this added flexibility and power comes at the cost of requiring greater understanding and learning on the part of the user.
 
-				Because SuperCollider involves actual programming, a rudimentary understanding of some principles and concepts of computer science will provide huge benefits to somebody learning the language.  The following articles from a free encyclopaedia should not be considered mandatory reading, but you should refer to them as necessary while learning the language.
+				Because SuperCollider involves actual programming, a rudimentary understanding of some principles and concepts of computer science will provide huge benefits to somebody learning the language.  The following articles from Wikipedia are not mandatory reading, but you should refer to them as necessary while you learn.
 				<itemizedlist>
-				<listitem><para>[http://en.wikipedia.org/wiki/Computer_programming Computer Programming]: You probably know what this is; it's what you'll be doing.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Programming_language Programming Language]: SuperCollider is a programming language.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Interpreter_%28computing%29 Interpreter]: This reads your code, and sends commands to the server, which cause it to produce sound.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Server_%28computing%29 Server]: SuperCollider has a 'server' component, which is operated by the interpreter.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Functional_programming Functional Programming]: SuperCollider can be treated as a "functional" language.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Imperative_programming Imperative Programming]: SuperCollider can be treated as an "imperative" language.</para></listitem>
-				<listitem><para>[http://en.wikipedia.org/wiki/Object-oriented_programming Object-oriented Programming]: SuperCollider can be treated as an "object-oriented" language.</para></listitem>
+				<listitem><para><citetitle>Computer Programming</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Computer_programming" />: You probably know what this is; it's what you'll be doing.</para></listitem>
+				<listitem><para><citetitle>Programming Language</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Programming_language" />: SuperCollider is a programming language.</para></listitem>
+				<listitem><para><citetitle>Interpreter</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Interpreter_%28computing%29" />: The interpreter reads your code and sends commands to the server, causing it to produce sound.</para></listitem>
+				<listitem><para><citetitle>Server</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Server_%28computing%29" />: SuperCollider has a 'server' component, which is operated by the interpreter.</para></listitem>
+				<listitem><para><citetitle>Functional Programming</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Functional_programming" />: SuperCollider can be treated as a "functional" language.</para></listitem>
+				<listitem><para><citetitle>Imperative Programming</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Imperative_programming" />: SuperCollider can be treated as an "imperative" language.</para></listitem>
+				<listitem><para><citetitle>Object-Oriented Programming</citetitle> at <ulink url="http://en.wikipedia.org/wiki/Object-oriented_programming" />: SuperCollider can be treated as an "object-oriented" language.  A short sketch contrasting these last three styles appears after this list.</para></listitem>
 				</itemizedlist>
 				<!-- Sourced from http://en.wikipedia.org/wiki/List_of_basic_computer_programming_topics -->
 			</para>
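			<para>
				To give a rough sense of what the last three terms mean in practice, the following sketch (not drawn from the Wikipedia articles or from the rest of this Guide) performs the same trivial task, doubling every item in a list, in each of the three styles.
			</para>
<programlisting>
// Functional style: apply a function to a collection; no explicit state.
[1, 2, 3].collect({ |item| item * 2 });

// Imperative style: an explicit loop that builds up a result step by step.
(
var result;
result = Array.new;
[1, 2, 3].do({ |item| result = result.add(item * 2) });
result.postln;
)

// Object-oriented style: even the number 2 is an object that responds
// to messages such as '*' and 'postln'.
(2 * [1, 2, 3]).postln;
</programlisting>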
@@ -51,10 +51,10 @@
 		<section id="sect-Musicians_Guide-SC-Req_and_Inst-Software_Requirements">
 			<title>Software Requirements</title>
 			<para>
-				SuperCollider uses the JACK Audio Connection Kit.  You should install JACK before installing SuperCollider.  Follow the instructions !!L!! here !!L!! to install JACK.
+				SuperCollider uses the JACK Audio Connection Kit.  You should install JACK before installing SuperCollider.  Refer to <xref linkend="sect-Musicians_Guide-Install_and_Configure_JACK" /> for instructions to install JACK.
 			</para>
 			<para>
-				SuperCollider is not available from the Fedora software repositories.  You must enable the "Planet CCRMA at Home" repository to install SuperCollider.  See !!L!! this section !!L!! for instructions that enable the "Planet CCRMA at Home" repository.  The "Planet CCRMA at Home" repository contains a wide variety of music and audio applications.
+				SuperCollider is not available from the Fedora software repositories.  You must enable the "Planet CCRMA at Home" repository to install SuperCollider.  See <xref linkend="sect-Musicians_Guide-CCRMA_Installing_Repository" /> for instructions to enable the "Planet CCRMA at Home" repository.  This repository contains a wide variety of music and audio applications.
 			</para>
 		</section>
 		<section id="sect-Musicians_Guide-SC-Req_and_Inst-Hardware_Requirements">
@@ -153,7 +153,7 @@
 				These steps should be followed every time you open GEdit, and wish to use the SuperCollider extension.
 			</para>
 			<procedure>
-				<step><para># From the menu, select 'Tools > SuperCollider Mode'</para></step>
+				<step><para>Choose <menuchoice><guimenu>Tools</guimenu><guimenuitem>SuperCollider Mode</guimenuitem></menuchoice></para></step>
 				<step><para>A 'SuperCollider' menu should appear, along with a window at the bottom that says "SuperCollider output".</para></step>
 				<step><para>If you cannot see the window at the bottom, then choose <menuchoice><guimenu>View</guimenu><guimenuitem>Bottom Pane</guimenuitem></menuchoice> from the menu, so that it shows up.  It is sometimes important to see the information that SuperCollider provides in this window.</para></step>
 				<step><para>After enabling SuperCollider mode, the window should display a series of notices.  Near the end should be something like this:

