From d9bae189983e5ae34c0d206cfe76e52cf4853101 Mon Sep 17 00:00:00 2001
From: Mark Robert Miller
Date: Wed, 26 Aug 2009 19:48:00 +0000
Subject: [PATCH] missed these when deleting benchmark page from site

git-svn-id: https://svn.apache.org/repos/asf/lucene/java/trunk@808165 13f79535-47bb-0310-9956-ffa450edef68
---
 docs/benchmarks.html | 1174 --------------------------------------------
 docs/benchmarks.pdf  |  354 -------------
 2 files changed, 1528 deletions(-)
 delete mode 100644 docs/benchmarks.html
 delete mode 100644 docs/benchmarks.pdf

diff --git a/docs/benchmarks.html b/docs/benchmarks.html
deleted file mode 100644
index f935c050900..00000000000
--- a/docs/benchmarks.html
+++ /dev/null
@@ -1,1174 +0,0 @@

Apache Lucene - Resources - Performance Benchmarks

Performance Benchmarks
The purpose of these user-submitted performance figures is to give current and potential users of Lucene a sense of how well Lucene scales. If the requirements for an upcoming project are similar to an existing benchmark, you will also have something to work with when designing the system architecture for the application.
If you've conducted performance tests with Lucene, we'd appreciate it if you submitted those figures for display on this page. Post them to the lucene-user mailing list using this template.
Benchmark Variables

Hardware Environment
  • Dedicated machine for indexing: Self-explanatory (yes/no)
  • CPU: Self-explanatory (type, speed, and quantity)
  • RAM: Self-explanatory
  • Drive configuration: Self-explanatory (IDE, SCSI, RAID-1, RAID-5)
Software environment

  • Lucene Version: Self-explanatory
  • Java Version: Version of the Java SDK/JRE that is run
  • Java VM: Server/client VM; Sun VM, JRockit, etc.
  • OS Version: Self-explanatory
  • Location of index: Is the index stored in the filesystem or in a database? Is it on the same server (local) or over the network?
Lucene indexing variables

  • Number of source documents: Number of documents being indexed
  • Total filesize of source documents: Self-explanatory
  • Average filesize of source documents: Self-explanatory
  • Source documents storage location: Where are the documents being indexed located? Filesystem, DB, HTTP, etc.
  • File type of source documents: Types of files being indexed, e.g. HTML files, XML files, PDF files, etc.
  • Parser(s) used, if any: Parsers used for parsing the various files for indexing, e.g. XML parser, HTML parser, etc.
  • Analyzer(s) used: Type of Lucene analyzer used
  • Number of fields per document: Number of Fields each Document contains
  • Type of fields: Type of each field
  • Index persistence: Where the index is stored, e.g. FSDirectory, SqlDirectory, etc.
Figures

  • Time taken (in ms/s as an average of at least 3 indexing runs): Time taken to index all files
  • Time taken / 1000 docs indexed: Time taken to index 1000 files
  • Memory consumption: Self-explanatory
  • Query speed: Average time a query takes, and the type of queries (e.g. simple one-term query, phrase query), not measuring any overhead outside Lucene
Notes

  • Notes: Any comments which don't belong in the above, special tuning/strategies, etc.
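The metrics requested under "Figures" above come down to simple arithmetic. As a rough sketch in plain Java (the class and method names here are ours, not part of the template), averaging several runs and normalizing to time per 1,000 documents might look like:

```java
// Hypothetical helper for filling in the "Figures" section of the template.
public class BenchmarkFigures {

    /** Average of several indexing runs, in milliseconds. */
    public static double averageMs(long[] runTimesMs) {
        long total = 0;
        for (long t : runTimesMs) {
            total += t;
        }
        return (double) total / runTimesMs.length;
    }

    /** Time taken per 1,000 documents indexed, in milliseconds. */
    public static double msPerThousandDocs(double totalMs, long docCount) {
        return totalMs * 1000.0 / docCount;
    }

    public static void main(String[] args) {
        // Three hypothetical indexing runs over 60,000 documents, in ms.
        long[] runs = {4_320_000L, 4_440_000L, 4_620_000L};
        double avg = averageMs(runs);
        System.out.printf("average: %.0f ms, per 1000 docs: %.0f ms%n",
                avg, msPerThousandDocs(avg, 60_000));
    }
}
```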
User-submitted Benchmarks

These benchmarks have been kindly submitted by Lucene users for reference purposes.

We make NO guarantees regarding their accuracy or validity.

We strongly recommend that you conduct your own performance benchmarks before deciding on a particular hardware/software setup (and, hopefully, submit those figures to us).
Hamish Carpenter's benchmarks

Hardware Environment

  • Dedicated machine for indexing: yes
  • CPU: Intel x86 P4 1.5GHz
  • RAM: 512MB DDR
  • Drive configuration: IDE 7200rpm RAID-1
Software environment

  • Lucene Version: 1.3
  • Java Version: 1.3.1, IBM JITC enabled
  • Java VM:
  • OS Version: Debian Linux 2.4.18-686
  • Location of index: local
Lucene indexing variables

  • Number of source documents: Random generator, set to make 1M documents in 2 x 500,000 batches.
  • Total filesize of source documents: > 1GB if stored
  • Average filesize of source documents: 1KB
  • Source documents storage location: Filesystem
  • File type of source documents: Generated
  • Parser(s) used, if any:
  • Analyzer(s) used: Default
  • Number of fields per document: 11
  • Type of fields: 1 date, 1 id, 9 text
  • Index persistence: FSDirectory
Figures

  • Time taken (in ms/s as an average of at least 3 indexing runs):
  • Time taken / 1000 docs indexed: 49 seconds
  • Memory consumption:

Notes
A Windows client ran a random document generator which created documents based on some arrays of values and an excerpt (approx. 1KB) from a text file of the Bible (King James Version). These were submitted via a socket connection (open throughout the indexing process). The index writer was not closed between index calls. This created a 400MB index in 23 files (after optimization).

Query details:

Set up a threaded class to start x number of simultaneous threads to search the index created above.

Query: +Domain:sos +(+((Name:goo*^2.0 Name:plan*^2.0) (Teaser:goo* Teaser:plan*) (Details:goo* Details:plan*)) -Cancel:y) +DisplayStartDate:[mkwsw2jk0 - mq3dj1uq0] +EndDate:[mq3dj1uq0 - ntlxuggw0]

This query counted 34,000 documents and I limited the returned documents to 5.

This uses Peter Halacsy's IndexSearcherCache, slightly modified to be a singleton that returns cached searchers for a given directory. This solved an initial problem with too many files open and running out of Linux file handles.
Threads | Avg time per query (ms)
      1 |  1009
      2 |  2043
      3 |  3087
      4 |  4045
     .. |    ..
     10 | 10091

I removed the two date range terms from the query and it made a HUGE difference in performance. With 4 threads the avg time dropped to 900ms!

Other query optimizations made little difference.
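Hamish's threaded search test can be sketched with nothing but the JDK. The class below is our own illustration, not his code: the actual Lucene query is stubbed out as a Runnable, so the harness measures whatever work is plugged in (a real test would call IndexSearcher.search there).

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Starts x simultaneous threads, each running the same query repeatedly,
// and reports the average wall time per query execution.
public class QueryLoadTest {

    public static double avgQueryMs(int threads, int queriesPerThread,
                                    Runnable query) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        AtomicLong totalNanos = new AtomicLong();
        for (int t = 0; t < threads; t++) {
            pool.execute(() -> {
                for (int q = 0; q < queriesPerThread; q++) {
                    long start = System.nanoTime();
                    query.run();                          // stand-in for the search call
                    totalNanos.addAndGet(System.nanoTime() - start);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        return totalNanos.get() / 1_000_000.0 / ((long) threads * queriesPerThread);
    }

    public static void main(String[] args) throws InterruptedException {
        // Dummy "query" that sleeps ~10 ms, just to exercise the harness.
        Runnable fakeQuery = () -> {
            try { Thread.sleep(10); } catch (InterruptedException ignored) { }
        };
        System.out.printf("avg %.1f ms/query%n", avgQueryMs(4, 5, fakeQuery));
    }
}
```

Note that a per-query average like this will show exactly the pattern in Hamish's table when the searcher serializes work: with N threads contending for one bottleneck, per-query wall time grows roughly N-fold.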
Hamish can be contacted at hamish at catalyst.net.nz.
Justin Greene's benchmarks

Hardware Environment

  • Dedicated machine for indexing: No, but nominal usage at time of indexing.
  • CPU: Compaq Proliant 1850R/600, 2 x PIII 600
  • RAM: 1GB, 256MB allocated to the JVM.
  • Drive configuration: RAID 5 on Fibre Channel array
Software environment

  • Java Version: 1.3.1_06
  • Java VM:
  • OS Version: WinNT 4/SP6
  • Location of index: local
Lucene indexing variables

  • Number of source documents: about 60K
  • Total filesize of source documents: 6.5GB
  • Average filesize of source documents: 100K (6.5GB/60K documents)
  • Source documents storage location: filesystem on NTFS
  • File type of source documents:
  • Parser(s) used, if any: Currently the only parser used is the Quiotix HTML parser.
  • Analyzer(s) used: SimpleAnalyzer
  • Number of fields per document: 8
  • Type of fields: All strings, and all are stored and indexed.
  • Index persistence: FSDirectory
Figures

  • Time taken (in ms/s as an average of at least 3 indexing runs): 1 hour 12 minutes, 1 hour 14 minutes, and 1 hour 17 minutes. Note that the number and size of documents change daily.
  • Time taken / 1000 docs indexed:
  • Memory consumption: JVM is given 256MB and uses it all.

Notes
We have 10 threads reading files from the filesystem, parsing and analyzing them, and pushing them onto a queue, and a single thread popping them from the queue and indexing. Note that we are indexing email messages and are storing the entire plaintext of each message in the index. If a message contains an attachment and we do not have a filter for the attachment (i.e. we do not do PDFs yet), we discard the data.
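The pipeline described above, several reader/parser threads feeding a queue and one thread draining it into the index, can be sketched with java.util.concurrent. This is our own illustration, not Justin's code: the "index" step here just consumes documents, where a real version would call IndexWriter.addDocument.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

// Producer/consumer sketch: several reader threads push parsed documents
// onto a bounded queue; a single consumer thread pops and "indexes" them.
public class IndexingPipeline {

    public static int run(int readers, int docsPerReader) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1_000);
        ExecutorService producers = Executors.newFixedThreadPool(readers);
        for (int r = 0; r < readers; r++) {
            final int id = r;
            producers.execute(() -> {
                for (int d = 0; d < docsPerReader; d++) {
                    try {
                        queue.put("doc-" + id + "-" + d); // stand-in for a parsed message
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            });
        }
        producers.shutdown();

        // Single indexing thread: pop each document and index it.
        int indexed = 0;
        int expected = readers * docsPerReader;
        while (indexed < expected) {
            queue.take();   // stand-in for IndexWriter.addDocument(doc)
            indexed++;
        }
        return indexed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("indexed " + run(10, 100) + " docs");
    }
}
```

The bounded queue is the important design choice: it applies backpressure, so ten fast parser threads cannot outrun one indexing thread and exhaust memory.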
Justin can be contacted at tvxh-lw4x at spamex.com.
Daniel Armbrust's benchmarks

My disclaimer is that this is a very poor "benchmark". It was not done for raw speed, nor was the total index built in one shot. The index was created on several different machines (all with these specs, or very similar), with each machine indexing batches of 500,000 to 1 million documents. Each of these small indexes was then moved to a much larger drive, where they were all merged together into one big index. This process was done manually, over the course of several months, as the sources became available.
Hardware Environment

  • Dedicated machine for indexing: no. The machine had moderate to low load; however, the indexing process was single-threaded, so it only took advantage of 1 of the 4 processors. It usually got 100% of that processor.
  • CPU: Sun Ultra 80, 4 x 64-bit processors
  • RAM: 4 GB memory
  • Drive configuration: Ultra-SCSI Wide 10000 RPM 36GB drive
Software environment

  • Lucene Version: 1.2
  • Java Version: 1.3.1
  • Java VM:
  • OS Version: Sun 5.8 (64-bit)
  • Location of index: local
Lucene indexing variables

  • Number of source documents: 13,820,517
  • Total filesize of source documents: 87.3 GB
  • Average filesize of source documents: 6.3 KB
  • Source documents storage location: Filesystem
  • File type of source documents: XML
  • Parser(s) used, if any:
  • Analyzer(s) used: A home-grown analyzer that simply removes stopwords.
  • Number of fields per document: 1 - 31
  • Type of fields: All text, though 2 of them are dates (e.g. 20001205) that we filter on
  • Index persistence: FSDirectory
  • Index size: 12.5 GB
Figures

  • Time taken (in ms/s as an average of at least 3 indexing runs): For 617,271 documents, 209,698 seconds (roughly 2.5 days)
  • Time taken / 1000 docs indexed: 340 seconds
  • Memory consumption: java executed with -Xmx1000m -Xss8192k, so 1 GB of memory was allotted to the indexer
Notes

The source documents were XML. The "indexer" opened each document one at a time, ran an XSL transformation on it, and then proceeded to index the stream. The indexer optimized the index every 50,000 documents (on this run), though previously we optimized every 300,000 documents; the performance didn't change much either way. We did no other tuning (RAM directories, a separate process to pre-transform the source material, etc.) to make it index faster. When all of these individual indexes were built, they were merged together into the main index. That process usually took about a day.
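Daniel's figures are internally consistent: 209,698 seconds for 617,271 documents works out to roughly 340 seconds per 1,000 documents and about 2.4 days of wall time, matching the reported numbers. A quick cross-check (the class is ours, added only for illustration):

```java
// Cross-check of the reported indexing figures.
public class ThroughputCheck {

    /** Seconds needed per 1,000 documents at the observed rate. */
    public static double secondsPerThousand(long docs, long seconds) {
        return seconds * 1000.0 / docs;
    }

    /** Elapsed seconds expressed in days. */
    public static double days(long seconds) {
        return seconds / 86_400.0;
    }

    public static void main(String[] args) {
        System.out.printf("%.0f s per 1000 docs, %.2f days%n",
                secondsPerThousand(617_271, 209_698), days(209_698));
    }
}
```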
Daniel can be contacted at Armbrust.Daniel at mayo.edu.
Geoffrey Peddle's benchmarks

I'm doing a technical evaluation of search engines for Ariba, an enterprise application software company. I compared Lucene to a commercial C-based search engine which I'll refer to as vendor A. Overall, Lucene's performance was similar to vendor A's and met our application's requirements. I've summarized our results below.

Search scalability:
We ran a set of 16 queries in a single thread for 20 iterations. We report below the times for the last 15 iterations (i.e. after the system was warmed up). The 4 sets of results below are for indexes with between 50,000 and 600,000 documents. Although the times for Lucene grew faster with document count than vendor A's, they were comparable.

Documents | Lucene (s) | Vendor A (s)
     50K  |    5.2     |     7.2
    200K  |   15.3     |    15.2
    400K  |   28.2     |    25.5
    600K  |   41       |    33

Individual query times:
Total query times are very similar between the 2 systems, but there were larger differences when you looked at individual queries.

For simple queries with small result sets, vendor A was consistently faster than Lucene. For example, a single query might take vendor A 32 thousandths of a second and Lucene 64 thousandths of a second. Both times, however, are well within acceptable response times for our application.

For simple queries with large result sets, vendor A was consistently slower than Lucene. For example, a single query might take vendor A 300 thousandths of a second and Lucene 200 thousandths of a second.

For more complex queries of the form (term1 or term2 or term3) AND (term4 or term5 or term6) AND (term7 or term8), the results were more divergent. For queries with small result sets, vendor A generally had very short response times, and Lucene sometimes had significantly larger response times. For example, vendor A might take 16 thousandths of a second and Lucene might take 156. I do not consider it to be the case that Lucene's response time grew unexpectedly, but rather that vendor A appeared to be taking advantage of an optimization which Lucene didn't have. (I believe there have been discussions on the dev mailing list about complex queries of this sort.)

Index size:
For our test data, the size of both indexes grew linearly with the number of documents. Note that these sizes are compact sizes, not the maximum size during index loading. The numbers below are from running du -k in the directory containing the index data. The larger numbers below for vendor A may be because it supports additional functionality not available in Lucene. I think it's the constant rate of growth, rather than the absolute amount, which is more important.

Documents | Lucene (KB) | Vendor A (KB)
     50K  |    45516    |    63921
    200K  |   171565    |   228370
    400K  |   345717    |   457843
    600K  |   511338    |   684913

Indexing times:
These times cover reading the documents from our database, processing them, inserting them into the document search product, and index compacting. Our data has a large number of fields/attributes. For this test I restricted Lucene to 24 attributes to reduce the number of files created. Doing this, I was able to specify a merge width for Lucene of 60. I found in general that Lucene's indexing performance is very sensitive to changes in the merge width. Note also that our application does a full compaction after inserting every 20,000 documents. These times are just within our acceptable limits, but we are interested in alternatives to increase Lucene's performance in this area.

600K documents
Lucene   81 minutes
A        34 minutes

(I don't have accurate results for all sizes on this measure, but I believe that the indexing time for both solutions grew essentially linearly with size. The time to compact the index generally grew with index size, but it's a small percentage of overall time at these sizes.)
Hardware Environment

  • Dedicated machine for indexing: yes
  • CPU: Dell Pentium 4 CPU 2.00GHz, 1 CPU
  • RAM: 1 GB
  • Drive configuration: Fujitsu MAM3367MP SCSI
Software environment

  • Java Version: 1.4.2_02
  • Java VM: JDK
  • OS Version: Windows XP
  • Location of index: local
Lucene indexing variables

  • Number of source documents: 600,000
  • Total filesize of source documents: from database
  • Average filesize of source documents: from database
  • Source documents storage location: from database
  • File type of source documents: XML
  • Parser(s) used, if any:
  • Analyzer(s) used: small variation on WhitespaceAnalyzer
  • Number of fields per document: 24
  • Type of fields: 1 keyword, 1 big unindexed, the rest unstored and a mix of tokenized/untokenized
  • Index persistence: FSDirectory
  • Index size: 12.5 GB
Figures

  • Time taken (in ms/s as an average of at least 3 indexing runs): 600,000 documents in 81 minutes (du -k = 511338)
  • Time taken / 1000 docs indexed: 123 documents/second
  • Memory consumption: -ms256m -mx512m -Xss4m -XX:MaxPermSize=512M
Notes

  • merge width of 60
  • did a compact every 20,000 documents
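Geoffrey's two throughput figures agree with each other: 600,000 documents in 81 minutes is 600,000 / (81 x 60) which is about 123 documents per second, as reported. A quick check (the class is ours, added only for illustration):

```java
// Cross-check of the reported indexing rate: 600,000 docs in 81 minutes.
public class IndexingRate {

    /** Documents indexed per second, given total docs and elapsed minutes. */
    public static double docsPerSecond(long docs, long minutes) {
        return docs / (minutes * 60.0);
    }

    public static void main(String[] args) {
        System.out.printf("%.0f docs/second%n", docsPerSecond(600_000, 81));
    }
}
```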
diff --git a/docs/benchmarks.pdf b/docs/benchmarks.pdf
deleted file mode 100644
index 0956e0ac288..00000000000
Binary files a/docs/benchmarks.pdf and /dev/null differ