[USENET] What's wrong with my .conf?
03-19-2009, 04:01 AM
Post: #1
[USENET] What's wrong with my .conf?
I can't get nzbget to work: when I upload a .nzb in the GUI, nothing happens. Here's my nzbget.conf:

Code:
# Sample configuration file for nzbget
#
# On POSIX put this file to one of the following locations:
# ~/.nzbget
# /etc/nzbget.conf
# /usr/etc/nzbget.conf
# /usr/local/etc/nzbget.conf
# /opt/etc/nzbget.conf
#
# On Windows put this file in program's directory.
#
# You can also put the file into any location, if you specify the path to it
# using switch "-c", e.g:
#   nzbget -c /home/user/myconfig.txt

# For quick start change the option MAINDIR and configure one news-server


##############################################################################
### PATHS                                                                  ###

# Root directory for all related tasks
# MAINDIR is a variable and therefore starts with "$"
# The value "~/download" is example for POSIX. On Windows use
# absolute paths, like "C:\Download".
$MAINDIR=/share/.nzbget

# Destination-directory to store the downloaded files
DestDir=/share/Downloads

# Directory to monitor for incoming nzb-jobs.
# Can have subdirectories (only one level of nesting).
# An nzb-file queued from a subdirectory will automatically be assigned to
# the category with the directory-name.
NzbDir=/share/Downloads/NBZs

# Directory to store download queue
QueueDir=${MAINDIR}/queue

# Directory to store temporary files
TempDir=${MAINDIR}/tmp

# Lock-file for daemon-mode, contains process-id (PID) (POSIX only)
LockFile=${MAINDIR}/tmp/nzbget.lock

# Where to store log file, if it needs to be created (see "CreateLog")
LogFile=${destdir}/nzbget.log


##############################################################################
### NEWS-SERVERS                                                           ###

# This section defines which servers nzbget should connect to.
# The servers will be ordered by their level, i.e. nzbget will at
# first try to download an article from the level-0-server.
# If that server fails, nzbget proceeds with the level-1-server, etc.
# A good idea is surely to put your major download-server at level 0
# and your fill-servers at levels 1,2,...
# NOTE: Start with level 0 and do not leave out a level in your server-list!
# NOTE: Several servers with the same level may be used, they will have
# the same priority.

# First server, on level 0
# Level of newsserver
Server1.Level=0
# Host-name of newsserver
server1.host=ssl-eu.astraweb.com
# Port to connect to (default 119 if not specified)
Server1.Port=563
# Username to use for authentication
Server1.Username=SECRET
# Password to use for authentication
Server1.Password=SECRET
# Server requires "Join Group"-command (yes, no) (default yes if not specified)
Server1.JoinGroup=yes
# Encrypted server connection (TLS/SSL) (yes, no)
Server1.Encryption=yes
# Maximal number of simultaneous connections to this server
Server1.Connections=20

# Second server, on level 0
#Server2.Level=0
#Server2.Host=my2.newsserver.com
#Server2.Port=119
#Server2.Username=me
#Server2.Password=mypass
#Server2.JoinGroup=yes
#Server2.Connections=4

# Third server, on level 1
#Server3.Level=1
#Server3.Host=fills.newsserver.com
#Server3.Port=119
#Server3.Username=me2
#Server3.Password=mypass2
#Server3.JoinGroup=yes
#Server3.Connections=1


##############################################################################
### PERMISSIONS (POSIX ONLY)                                               ###

# User name for daemon-mode (POSIX in daemon-mode only).
# Set the user that the daemon normally runs as.
# Set $MAINDIR with an absolute path to be sure where it will write.
# This allows nzbget daemon to be launched in rc.local (at boot), and
# download items as a specific user id.
# NOTE: This option has effect only if the program was started from
# root-account, otherwise it is ignored and the daemon runs under
# current user id
DaemonUserName=nmt

# Specify default umask (affects file permissions) for newly created
# files (POSIX only).
# The value should be written in octal form (the same as for "umask" shell
# command). If the umask is not specified (or a value greater than 0777 is
# used, which is useful to disable the current config-setting via a
# command-line parameter) the umask-mode will not be set and the current
# umask-mode (set via shell) will be used
# NOTE: do not forget to uncomment the next line
UMask=000


##############################################################################
### DOWNLOAD QUEUE                                                         ###

# Save download queue to disk. This allows it to be reloaded on the next start (yes, no)
SaveQueue=yes

# Reload download queue on start, if it exists (yes, no)
ReloadQueue=yes

# Reuse articles saved in temp-directory from previous program start (yes, no)
# This allows the download of a file to continue if the program was exited
# before the file was completed.
ContinuePartial=yes

# Create subdirectory with category-name in destination-directory (yes, no)
AppendCategoryDir=yes

# Create subdirectory with nzb-filename in destination-directory (yes, no)
AppendNzbDir=yes

# How often incoming-directory (option "NzbDir") must be checked for new
# nzb-files, in seconds.
# Value "0" disables the check.
NzbDirInterval=5

# Minimum age an nzb-file must have before it is loaded into the queue, in seconds.
# Nzbget checks that the nzb-file has not been modified within the last few
# seconds, as defined by this option. This safety interval prevents loading
# files which have not yet been completely saved to disk, for example if they
# are still being downloaded in a web-browser.
NzbDirFileAge=60

# Check for duplicate files (yes, no)
# If this option is enabled the program performs these checks when adding
# a new nzb-file:
# 1) whether the nzb-file contains duplicate entries. This check aims at
#    detecting reposted files (if the first file was not fully uploaded);
#    if the program finds two files with identical names, only the
#    biggest of these files will be added to the queue;
# 2) whether the download queue already contains a file with the same name;
# 3) whether the destination file already exists on disk.
# In the last two cases: if the file exists it will not be added to the queue.
# If this option is disabled, all files are downloaded and duplicate files
# are renamed to "filename_duplicate1".
# Existing files are never deleted or overwritten.
DupeCheck=yes

# Visibly rename broken files on download appending "_broken" (yes, no)
# Do not activate this option if par-check is enabled.
RenameBroken=no

# Decode articles (yes, no)
# yes - decode articles using internal decoder (supports yEnc and UU formats).
# no - the articles will not be decoded and joined. External programs
#      (like "uudeview") can be used to decode and join downloaded articles.
#      Also useful for debugging to look at article's source text.
Decode=yes

# Write decoded articles directly into destination output file (yes, no)
# With this option enabled the program first creates the output
# destination file with the required size (total size of all articles),
# then writes the decoded articles on the fly directly to that file,
# without creating any temporary files, not even for decoded articles.
# This may result in a major performance improvement, but it highly
# depends on the OS and filesystem used.
# It can improve performance on very fast internet connections,
# but you need to test whether it works in your case.
# INFO: Tests showed that on Linux with an EXT3-partition activating
# this option results in up to 20% better performance, but on Windows with NTFS
# or Linux with FAT32-partitions the performance decreased.
# The likely reason is that on an EXT3-partition Linux can create large files
# very fast (if the content of the file does not need to be initialized),
# but Windows on an NTFS-partition and also Linux on a FAT32-partition need to
# initialize the created large file with nulls, resulting in a big performance
# degradation.
# NOTE: for testing, try to download a few big files (with a total size of
# 500-1000MB) and measure the required time. Do not rely on the program's
# speed indicator.
# NOTE: if both options "DirectWrite" and "ContinuePartial" are enabled,
# the program will create empty article-files in the temp-directory. They
# are used to continue the download of a file on the next program start.
# To minimize disk-io it is recommended to disable the option
# "ContinuePartial" if "DirectWrite" is enabled. Especially on fast
# connections (where you would want to activate "DirectWrite") it should
# not be a problem to redownload an interrupted file.
DirectWrite=yes

# Check CRC of downloaded and decoded articles (yes, no)
# Normally this option should be enabled for better detection of download
# errors. However, checking the CRC needs about the same CPU time as
# decoding the articles. On fast connections with slow CPUs, disabling the
# CRC-check may slightly improve performance (if the CPU is the limiting factor).
CrcCheck=yes

# How many retries should be attempted if a download error occurs
Retries=4

# Set the interval between retries, in seconds
RetryInterval=10

# Redownload article if CRC-check fails (yes, no)
# Helps to minimize number of broken files, but may be effective
# only if you have multiple download servers (even from the same provider
# but from different locations (e.g. europe, usa)).
# In any case the option increases your traffic.
# For slow connections loading of extra par-blocks may be more effective
# The option "CrcCheck" must be enabled for option "RetryOnCrcError" to work.
RetryOnCrcError=yes

# Set connection timeout, in seconds
ConnectionTimeout=60

# Timeout until a download-thread is killed, in seconds
# This can help on hanging downloads, but is dangerous.
# Do not use small values!
TerminateTimeout=600

# Set the (approximate) maximum number of allowed threads.
# Sometimes under certain circumstances the program may create way too many
# download threads. Most of them are in a wait-state. That is not bad,
# but threads are usually a limited resource. If a program creates too many
# of them, the operating system may kill it. The option <ThreadLimit> prevents that.
# NOTE 1: the number of threads is not the same as the number of connections
# opened to NNTP-servers. Do not use the option <ThreadLimit> to limit the
# number of connections. Use the appropriate options <ServerX.Connections>
# instead.
# NOTE 2: the actual number of created threads can be slightly larger than
# defined by the option. Important threads may be created even if the
# number of threads is exceeded. The option prevents only the creation of
# additional download threads.
# NOTE 3: in most cases you should leave the default value "100" unchanged.
# However you may increase that value if you need more than 90 connections
# (that's very unlikely) or decrease the value if the OS does not allow so
# many threads. But most OSes should not have problems with 100 threads.
ThreadLimit=10

# Set the maximum download rate in KB/s, "0" means no speed control
DownloadRate=0

# Set the size of the memory buffer used when writing articles, in bytes.
# Bigger values decrease disk-io, but increase memory usage.
# Value "0" causes the OS-dependent default value to be used.
# With value "-1" (which means "max/auto") the program sets the size of the
# buffer according to the size of the current article (typically less than 500K).
# NOTE: the value must be written in bytes, do not use postfixes "K" or "M".
# NOTE: to calculate the memory usage multiply WriteBufferSize by the max number
# of connections configured in section "NEWS-SERVERS".
# NOTE: a typical article's size does not exceed 500000 bytes, so using bigger
# values (like several megabytes) will just waste memory.
# NOTE: for desktop computers with a large amount of memory the value "-1"
# (max/auto) is recommended, but for computers with very little memory
# (routers, NAS) the value "0" (default OS-dependent size) could be the better
# alternative.
# NOTE: the write-buffer is managed by the OS (system libraries) and therefore
# the effect of this option is highly OS-dependent.
WriteBufferSize=0

# Pause if disk space gets below this value, in MegaBytes.
# Value "0" disables the check.
# Only the disk space on the drive with "DestDir" is checked.
# The drive with "TempDir" is not checked.
DiskSpace=250


##############################################################################
### LOGGING                                                                ###

# Create log file (yes, no)
CreateLog=no

# Delete log file upon server start (only in server-mode) (yes, no)
ResetLog=no

# How various messages must be printed (screen, log, both, none)
# Debug-messages can be printed only if the program was compiled in
# debug-mode: "./configure --enable-debug"
ErrorTarget=screen
WarningTarget=screen
InfoTarget=screen
DetailTarget=screen
DebugTarget=none

# Number of messages stored in buffer and available for remote clients
LogBufferSize=1000

# Create a log of all broken files (yes, no)
# It is a text file placed near downloaded files, which contains
# the names of broken files
CreateBrokenLog=yes

# Create memory dump (core-file) on abnormal termination (POSIX only) (yes, no)
# Core-files are very helpful for debugging.
# NOTE: core-files may contain sensitive data, like your login/password for
# the newsserver etc.
DumpCore=no

# See also option "logfile" in section "PATHS"


##############################################################################
### DISPLAY                                                                ###

# Set screen-outputmode (loggable, colored, curses)
# loggable - only messages will be printed to standard output;
# colored  - prints messages (with simple coloring for message categories)
#            and download progress info; uses escape-sequences to move the cursor;
# curses   - advanced interactive interface with the ability to edit the
#            download queue and various output options;
OutputMode=loggable

# Shows NZB-Filename in file list in curses-outputmode (yes, no)
# This option controls the initial state of the curses-frontend;
# it can be switched on/off at run-time with the Z-key
CursesNzbName=yes

# Show files in groups (NZB-files) in queue list in curses-outputmode (yes, no)
# This option controls the initial state of the curses-frontend;
# it can be switched on/off at run-time with the G-key
CursesGroup=no

# Show timestamps in message list in curses-outputmode (yes, no)
# This option controls the initial state of the curses-frontend;
# it can be switched on/off at run-time with the T-key
CursesTime=no

# Update interval for Frontend-output in MSec (min value 25)
# Bigger values reduce CPU usage (especially in curses-outputmode)
# and network traffic in remote-client mode
UpdateInterval=200


##############################################################################
### CLIENT/SERVER COMMUNICATION                                            ###

# Set the IP on which the server listens and which the client uses to contact
# the server. It can be a dns-hostname or an ip-address (the latter is more
# efficient since it does not require a dns-lookup).
# If you want the server to listen on all interfaces, use "0.0.0.0"
ServerIp=127.0.0.1

# Set the port which the server & client use
ServerPort=6789

# Set the password needed to successfully queue a request
ServerPassword=tegbzn6789

# See also option "logbuffersize" in section "LOGGING"


##############################################################################
### PAR CHECK/REPAIR AND POSTPROCESSING                                    ###

# Reload Post-processor-queue on start, if it exists (yes, no)
# For this option to work the options "SaveQueue" and "ReloadQueue" must
# be also enabled.
ReloadPostQueue=yes

# How many par2-files to load (none, all, one)
# none - all par2-files must be automatically paused
# all - all par2-files must be downloaded
# one - only one main par2-file must be downloaded and the others must be paused
# Paused files remain in queue and can be unpaused by parchecker when needed
LoadPars=one

# Automatic par-verification (yes, no)
# To download only needed par2-files (smart par-files loading) set also
# the option "loadpars" to "one". If option "loadpars" is set to "all",
# all par2-files will be downloaded before verification and repair starts.
# The option "renamebroken" must be set to "no", otherwise the par-checker
# may not find renamed files and fail
ParCheck=yes

# Automatic par-repair (yes, no)
# If option "parcheck" is enabled and "parrepair" is not, the program
# only verifies downloaded files and downloads needed par2-files, but does
# not start repair-process. This is useful if the server does not have
# enough CPU power, since repairing large files may take too many
# resources and too much time on slow computers.
# This option has effect only if the option "parcheck" is enabled
ParRepair=yes

# Use only par2-files with matching names (yes, no)
# If par-check needs extra par-blocks it searches for par2-files
# in download queue, which can be unpaused and used for restore.
# These par2-files should have the same base name as the main par2-file,
# currently loaded in the par-checker. Sometimes extra par-files (especially if
# they were uploaded by a different poster) do not have matching names.
# Normally the par-checker does not use these files, but you can allow it
# to use them by setting "strictparname" to "no".
# This however has a side effect: if the NZB-file contains more than one
# collection of files (with different par-sets), the par-checker may download
# par-files from the wrong collection. This increases your traffic (but does
# not harm the par-check).
# NOTE: par-checker always uses only par-files added from the same NZB-file
# and the option "strictparname" does not change this behavior
StrictParName=yes

# Pause download queue during check/repair (yes, no)
# Enable the option to give CPU more time for par-check/repair. That helps
# to speed up check/repair on slow CPUs with fast connection (e.g. NAS-devices).
# NOTE: if the parchecker needs additional par-files it temporarily unpauses the queue
# NOTE: See also option <PostPauseQueue>.
ParPauseQueue=no

# Cleanup download queue after successful check/repair (yes, no)
# Enable this option for automatic deletion of unneeded (paused) par-files
# from download queue after successful check/repair.
# NOTE: before cleaning up the program checks if all paused files are par-files.
# If there are paused non-par-files (this means that you have paused them
# manually), the cleanup will be skipped for this collection.
ParCleanupQueue=yes

# Delete source nzb-file after successful check/repair (yes, no)
# Enable this option for automatic deletion of nzb-file from incoming directory
# after successful check/repair.
NzbCleanupDisk=yes

# Set path to program, that must be executed after the download of nzb-file
# or one collection in nzb-file (if par-check enabled and nzb-file contains
# multiple collections; see note below for the definition of "collection")
# is completed and possibly par-checked/repaired.
# Arguments passed to that program:
#  1 - path to destination dir, where downloaded files are located;
#  2 - name of nzb-file processed;
#  3 - name of par-file processed (if par-checked) or empty string (if not);
#  4 - result of par-check:
#      0 - not checked: par-check disabled or nzb-file does not contain any
#          par-files;
#      1 - checked and failed to repair;
#      2 - checked and successfully repaired;
#      3 - checked and can be repaired but repair is disabled;
#  5 - state of nzb-job:
#      0 - there are more collections in this nzb-file queued;
#      1 - this was the last collection in nzb-file;
#  6 - indication of failed par-jobs for current nzb-file:
#      0 - no failed par-jobs;
#      1 - the current par-job or any of the previous par-jobs for the
#          same nzb-file failed;
#  7 - category assigned to nzb-file (can be empty string).
#
# NOTE: The parameter "state of nzb-job" is very important and MUST be checked
# even in the simplest scripts.
# If par-check is enabled and nzb-file contains more than one collection
# of files the postprocess-program is called after each collection is completed
# and par-checked. If you want to unpack files or clean up the directory
# (delete par-files, etc.) there are two possibilities, when you can do this:
#  1) you parse the "name of par-file processed" to find out the base name
#     of collection and clean up only files from this collection (not reliable,
#     because par-files sometimes have different names than rar-files);
#  2) or you just check the parameters "state of nzb-job" and "indication of
#     failed par-jobs" and do the processing, only if they are set to "1"
#     (which means, that this was the last collection in nzb-file and all files
#      are now completed) and to "0" (no failed par-jobs) respectively;
# NOTE 2: if the option "ParCheck" is disabled nzbget calls PostProcess
# only once, not after every collection, because the detection of collection
# is disabled in this case;
# NOTE 3: the term "collection" in the above description actually means
# "par-set". To determine what "collections" are present in nzb-file nzbget
# looks for par-sets. If any collection of files within nzb-file does
# not have any par-files, this collection will not be detected.
# For example, for nzb-file containing three collections but only two par-sets,
# the postprocess will be called two times - after processing of each par-set.
# NOTE 4: an example script for unrarring is provided within distribution
# in file <postprocess-example.sh>
# NOTE 5: do not forget to uncomment the next line
PostProcess=/share/.nzbget/unpak.sh

# Allow multiple post-processing for the same nzb-file (yes,no)
# After the post-processing (par-check and call of a postprocess-script) is
# completed, nzbget adds the nzb-file to a list of completed-jobs. The nzb-file
# stays in the list until the last file from that nzb-file is deleted from
# the download queue (it occurs straight away if the par-check was successful
# and the option "ParCleanupQueue" is enabled).
# That means that if a paused file from an nzb-collection becomes unpaused
# (manually or from a post-process-script) after the collection was already
# post-processed, nzbget will not post-process the nzb-file again.
# This prevents the unwanted multiple post-processings of the same nzb-file.
# But it might be needed if the par-check/-repair are performed not directly
# by nzbget but from a post-process-script.
# NOTE 1: enable this option only if you were advised to do that by the author
# of the post-process-script
# NOTE 2: by enabling "AllowReProcess" you should disable the option "ParCheck"
# to prevent multiple par-checking
AllowReProcess=no

# Set the default message-kind for output received from postprocess-script
# (None, Detail, Info, Warning, Error, Debug).
# NZBGet checks if the line written by the script to stdout or stderr starts
# with special character-sequence, determining the message-kind, e.g.:
# [INFO] bla-bla
# [DETAIL] bla-bla
# [WARNING] bla-bla
# [ERROR] bla-bla
# [DEBUG] bla-bla
# If the message-kind was detected the text is added to log with detected type.
# Otherwise the message becomes the default kind, specified in this option.
PostLogKind=Detail

# Pause download queue during executing of postprocess-script (yes, no)
# Enable the option to give CPU more time for postprocess-script. That helps
# to speed up postprocess on slow CPUs with fast connection (e.g. NAS-devices).
# NOTE: See also option <ParPauseQueue>.
PostPauseQueue=no

##############################################################################
### PERFORMANCE                                                            ###

# On a very fast connection and slow CPU and/or drive the following
# settings may improve performance:
# 1) Disable par-checking and -repairing ("ParCheck=no"). VERY important,
#    because par-checking/repairing needs a lot of CPU-power and
#    significantly increases disk usage;
# 2) Try to activate option "DirectWrite" ("DirectWrite=yes");
# 3) Disable option "CrcCheck" ("CrcCheck=no");
# 4) Disable option "ContinuePartial" ("ContinuePartial=no");
# 5) Do not limit download rate ("DownloadRate=0"), because the bandwidth
#    throttling eats CPU time;
# 6) Disable logging for info- and debug-messages ("InfoTarget=none",
#    "DebugTarget=none");
# 7) Run the program in daemon (Posix) or service (Windows) mode and use
#    a remote client for the short periods of time needed to control the
#    download process on the server. Daemon/Service mode uses less CPU
#    because it does not update output on the screen.
# 8) Increase the value of option "WriteBufferSize" or better set it to
#    "-1" (max/auto) if you have spare 5-20 MB of memory.

Popcorn Hour A110 with 1TB Internal + 500GB USB on a Panasonic Viera TH-42PX80E
03-19-2009, 07:27 AM
Post: #2
RE: [USENET] What's wrong with my .conf?
Port 563 is correct for secure NNTP usage.

The only other thing I could spot was
Code:
NzbDir=/share/Downloads/NBZs

which should probably be /NZBs unless you're sure that folder exists.
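If the folder that actually exists on the share is called "NZBs", the line would need to read (the folder name here is just a guess, adjust it to whatever really exists):
Code:
# point NzbDir at a folder that actually exists on the share
NzbDir=/share/Downloads/NZBs

Also note the posted conf has NzbDirInterval=5 and NzbDirFileAge=60, so even with the right folder a freshly uploaded nzb-file is only picked up once it has sat there unmodified for about a minute.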
03-19-2009, 01:22 PM
Post: #3
RE: [USENET] What's wrong with my .conf?
Got it to work now, I don't know how; I just restarted the Popcorn a few times. My problem now is that when I'm downloading something I can't refresh http://192.168.2.2:8066/ because it's too slow.

Popcorn Hour A110 with 1TB Internal + 500GB USB on a Panasonic Viera TH-42PX80E
03-19-2009, 09:56 PM (This post was last modified: 03-19-2009 09:57 PM by stejan.)
Post: #4
RE: [USENET] What's wrong with my .conf?
(03-19-2009 01:22 PM)Bushdoctor Wrote:  Got it to work now, I don't know how; I just restarted the Popcorn a few times. My problem now is that when I'm downloading something I can't refresh http://192.168.2.2:8066/ because it's too slow.

The server connections parameter is really too high for the PCH, which has a slow CPU, especially with SSL encryption.
Try
Server1.Connections=4
instead and see what happens; a refresh shouldn't take more than 25-30 sec then (almost immediate without SSL activated).
For me, using 4 or 8 connections didn't change the download speed (about 750 Kbytes/s).
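In other words, something like this for the whole Server1 block (credentials left out, the other values taken from the posted conf), just as a sketch of the suggested change:
Code:
Server1.Level=0
Server1.Host=ssl-eu.astraweb.com
Server1.Port=563
Server1.JoinGroup=yes
Server1.Encryption=yes
# 4 connections instead of 20; the slow PCH CPU is the bottleneck, not the line
Server1.Connections=4

The conf is only read at startup, so restart nzbget after changing it.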