I am new to HTCondor. I am doing my main project with Condor on Raspberry Pi. Currently I have connected two Pis into a Condor pool, and I am following the "dag.html" page to run a job. The problem is that the job runs on the submitting machine.
Can anyone help me with this? It would be a great help.
The submit file is:
Universe = vanilla
Executable = simple
should_transfer_files = yes
when_to_transfer_output = on_exit
MultiCPUJob = True
transfer_input_files = simple.dag
Arguments = 4 10
Log = simple.log
Output = simple.out
Error = simple.error
Queue
The log file:
pi@raspberrypi:~/job $ cat simple.log
000 (012.000.000) 02/25 06:00:21 Job submitted from host: <10.0.101.122:46766>
DAG Node: simple
...
001 (012.000.000) 02/25 06:00:31 Job executing on host: <10.0.101.122:36154>
...
006 (012.000.000) 02/25 06:00:35 Image size of job updated: 7
0 - MemoryUsage of job (MB)
0 - ResidentSetSize of job (KB)
...
005 (012.000.000) 02/25 06:00:36 Job terminated.
(1) Normal termination (return value 0)
Usr 0 00:00:00, Sys 0 00:00:00 - Run Remote Usage
Usr 0 00:00:00, Sys 0 00:00:00 - Run Local Usage
Usr 0 00:00:00, Sys 0 00:00:00 - Total Remote Usage
Usr 0 00:00:00, Sys 0 00:00:00 - Total Local Usage
56 - Run Bytes Sent By Job
6230 - Run Bytes Received By Job
56 - Total Bytes Sent By Job
6230 - Total Bytes Received By Job
Partitionable Resources :    Usage  Request  Allocated
   Cpus                 :                 1          1
   Disk (KB)            :       14       10      76532
   Memory (MB)          :        0        1        434
...
pi@raspberrypi:~/job $
Posted on 2016-05-16 06:35:04
If the job is running on the submit machine, that machine must be running a STARTD as well as the SCHEDD, NEGOTIATOR, etc. If you don't need it to run jobs at all, you can turn the STARTD off in /etc/condor/condor_config (or condor_config.local, depending on how it was installed).
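As a rough illustration, a minimal sketch of what that could look like in condor_config.local on the submit machine. It assumes this Pi is also the central manager (i.e. it runs the COLLECTOR and NEGOTIATOR); the essential point is only that STARTD is left out of DAEMON_LIST, so the machine stops advertising itself as an execute node.

# /etc/condor/condor_config.local on the submit Pi (sketch; assumes it is also the central manager)
DAEMON_LIST = MASTER, COLLECTOR, NEGOTIATOR, SCHEDD

After editing, restart HTCondor (e.g. condor_restart, or sudo service condor restart) and check condor_status: the submit Pi should no longer show up as an execute slot.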
If you want the submit node to be able to run some jobs, just not these ones, you can specify
requirements = TARGET.Machine == "foo@bar.com"
to force it to run on a particular machine, or use anything else specific to your needs to filter out the submit node, based on the ClassAds of the startd nodes (use condor_status -l to see them). You can also define your own ClassAd attributes in the configuration files and then use them in your jobs' requirements:
Startd node (condor_config.local): POOL = "start_pool"
Submit node (submit.dag): requirements = POOL == "start_pool"
That gives you better customization.
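A minimal sketch of that custom-attribute approach, assuming the attribute name POOL and the value "start_pool" are just placeholders you can rename to suit your pool; STARTD_ATTRS is the standard HTCondor setting for publishing extra attributes into a startd's ClassAd.

# condor_config.local on each execute Pi (sketch)
POOL = "start_pool"
STARTD_ATTRS = $(STARTD_ATTRS) POOL

# in the submit description file for each DAG node job (e.g. the file defining "simple")
requirements = (POOL == "start_pool")

After a condor_reconfig on the execute nodes, condor_status -l should show POOL = "start_pool" in their ClassAds; since the submit Pi does not advertise that attribute, the requirements expression cannot match it and the job has to run on the other Pi.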
https://stackoverflow.com/questions/35619756