
Download big files via FTP with Python

This article covers downloading a large backup file over FTP with Python; the question and answer below may be a useful reference if you run into the same problem.

Problem description

I'm trying to download a daily backup file from my server to my local storage server, but I've run into some problems.

I wrote this code (the irrelevant parts, such as the email function, have been removed):

import os
from time import strftime
from ftplib import FTP
import smtplib
from email.MIMEMultipart import MIMEMultipart
from email.MIMEBase import MIMEBase
from email.MIMEText import MIMEText
from email import Encoders

day = strftime("%d")
today = strftime("%d-%m-%Y")

link = FTP(ftphost)
link.login(passwd = ftp_user_pass, user = ftp_user) if False else link.login(passwd = ftp_pass, user = ftp_user)
link.cwd(file_path)
# stream the remote file straight into today's local backup file
link.retrbinary('RETR ' + file_name, open('/var/backups/backup-%s.tgz' % today, 'wb').write)
link.delete(file_name) # delete the file from the online server
link.close()
mail(user_mail, "Download database %s" % today, "Database successfully downloaded: %s" % file_name)
exit()

I run it from a crontab entry like:

40    23    *    *    *    python /usr/bin/backup-transfer.py >> /var/log/backup-transfer.log 2>&1

It works with small files, but with the backup files (about 1.7 GB) it freezes: the downloaded file reaches about 1.2 GB, then never grows (I waited about a day), and the log file is empty.
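As a side note (not part of the original question): since Python 2.6, ftplib.FTP accepts a timeout argument, so a stalled transfer surfaces as an exception instead of hanging forever. A minimal sketch, reusing the same placeholder names (ftphost, ftp_user, ftp_pass, file_path, file_name, today) as the script above:

import socket
from ftplib import FTP

try:
    # with a timeout set, a stalled data connection raises socket.timeout instead of blocking forever
    link = FTP(ftphost, timeout = 30)
    link.login(passwd = ftp_pass, user = ftp_user)
    link.cwd(file_path)
    link.retrbinary('RETR ' + file_name,
                    open('/var/backups/backup-%s.tgz' % today, 'wb').write)
except socket.timeout:
    print "transfer stalled, retry later"

This is also what the accepted answer below relies on (note its "Keep low timeout" comment).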

Any ideas?

P.S.: I'm using Python 2.6.5.

Recommended answer

Sorry to answer my own question, but I found the solution.

I tried ftputil with no success, then tried several other approaches, and finally this works:

def ftp_connect(path):
    link = FTP(host = 'example.com', timeout = 5) #Keep low timeout
    link.login(passwd = 'ftppass', user = 'ftpuser')
    debug("%s - Connected to FTP" % strftime("%d-%m-%Y %H.%M"))
    link.cwd(path)
    return link

downloaded = open('/local/path/to/file.tgz', 'wb')

def debug(txt):
    print txt

link = ftp_connect(path)
file_size = link.size(filename)

max_attempts = 5 # I don't want an endless retry loop.

while file_size != downloaded.tell():
    try:
        debug("%s while > try, run retrbinary
" % strftime("%d-%m-%Y %H.%M"))
        if downloaded.tell() != 0:
            link.retrbinary('RETR ' + filename, downloaded.write, downloaded.tell())
        else:
            link.retrbinary('RETR ' + filename, downloaded.write)
    except Exception as myerror:
        if max_attempts != 0:
            debug("%s while > except, something going wrong: %s
 	file lenght is: %i > %i
" %
                (strftime("%d-%m-%Y %H.%M"), myerror, file_size, downloaded.tell())
            )
            link = ftp_connect(path)
            max_attempts -= 1
        else:
            break
debug("Done with file, attempt to download m5dsum")
[...]

In my log file I found:

01-12-2011 23.30 - Connected to FTP
01-12-2011 23.30 while > try, run retrbinary
02-12-2011 00.31 while > except, something going wrong: timed out
    file lenght is: 1754695793 > 1754695793
02-12-2011 00.31 - Connected to FTP
Done with file, attempt to download m5dsum

Sadly, I have to reconnect to the FTP server even when the file has been fully downloaded; in my case that is not a problem, because I have to download the md5sum as well.
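The md5sum step itself is not shown in the answer; here is one possible sketch of it (the remote ".md5" file name and its "hexdigest  filename" line format are assumptions, not from the original):

import hashlib

def md5_of(path, chunk_size = 1024 * 1024):
    # hash the local backup in chunks so a 1.7 GB file never has to fit in memory
    md5 = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), ''):
            md5.update(chunk)
    return md5.hexdigest()

# hypothetical remote checksum file: "<filename>.md5" containing "<hexdigest>  <filename>"
lines = []
link.retrlines('RETR ' + filename + '.md5', lines.append)
remote_md5 = lines[0].split()[0]

if md5_of('/local/path/to/file.tgz') == remote_md5:
    debug("md5 check passed")
else:
    debug("md5 mismatch, the download is corrupt")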

As you can see, I haven't been able to detect the timeout and retry on the same connection; when I hit the timeout, I simply reconnect. If someone knows how to reconnect without creating a new ftplib.FTP instance, let me know ;)
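One approach, not from the original answer and therefore only a sketch: probe the existing connection with a NOOP command and rebuild the ftplib.FTP object only when that probe fails.

import ftplib

def get_link(link, path):
    # reuse the current connection if the server still answers, otherwise reconnect
    try:
        link.voidcmd('NOOP')
        return link
    except ftplib.all_errors:
        debug("connection looks dead, reconnecting")
        return ftp_connect(path)

Whether a transfer can actually be resumed on the old connection after a timeout still depends on the server, so the retry loop above is needed either way.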


