General Question

Compress a download before downloading?

Asked by chromaBYTE (652points) October 1st, 2008

On our school network, we aren’t allowed to download EXE files or files over a certain size. However I’ve been needing them quite often at school because of my computer science project. I can get them downloaded if I go see the technicians, but they’re not always there and I have to answer a ton of questions about what I’m going to use the software for, and check that it’s not a virus, etc.

Are there any sites, or anything I can do, so that the EXE gets placed inside a ZIP file before I download it?


6 Answers

tWrex:

No, that is impossible on the download side alone. You could try using something other than your browser to download, like wget. Or you could set up a server of your own on your home network, use wget to download the file to your box, zip it back up from the command line, and then download it from your server.

blastfamy:

You could RAR it into multiple pieces, then reassemble the pieces, similar to the way it's done for torrents…
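The splitting idea can be sketched with plain split and cat (rar -v produces .partN.rar volumes the same way; the file and piece sizes here are arbitrary):

```shell
# Break a file into pieces smaller than the size cap, then reassemble.
head -c 1048576 /dev/zero > big.exe    # 1 MiB stand-in for the real EXE
split -b 262144 big.exe big.part.      # four 256 KiB pieces: big.part.aa ...
cat big.part.* > rebuilt.exe           # the shell glob sorts pieces in order
cmp -s big.exe rebuilt.exe && echo "files match"   # prints: files match
```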

richardhenry:

I think the above boils down to "unfortunately, no, you're going to have to download it at home."

chromaBYTE:

Hmm, if I were good at setting up servers and writing scripts, I could set up something on my computer at home: I'd send it a link to a file through a web interface, it would split the file into multiple RARs, then give me the links to download the pieces off my computer.

But I wouldn't have any idea how to go about this… haha.

tWrex:

@chromaBYTE I think that idea is very risky, in that you'd have to give the web app access to your CLI, which would open up all kinds of security holes for you. If it could be done with PHP/Python/Perl/Ruby alone, then yeah, it'd be fine, but giving the web access to your CLI is like saying, "Hey, I don't use a firewall and I run an open SMTP server. C'mon in!"

mjoyce:

Most web servers, Apache for example, will use the gzip compression algorithm to compress most files before sending them down to the browser, and Firefox, IE, and Chrome support this functionality right out of the box. So that answers the question in your title.

What you are actually looking for is a proxy. It is likely that your IT department is not doing stateful filtering of the file contents as they are transferred, but is instead looking at the filename in the HTTP GET request and intercepting it at a web proxy.

A trivial way to solve this is to run your own proxy that bypasses the restriction by removing the ".exe" from the URL your browser requests.

As an example:
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw(:standard);
use File::Fetch;

my $url = param('url');
write_page();

if (defined($url)) {
    # Fetch the file, then rename it so the URL no longer ends in ".exe"
    my $fn = fetch_file($url);
    rename($fn, "$fn.obscure") or die "rename failed: $!";
    print qq{grab your file: <a href="$fn.obscure">$fn.obscure</a> <br />\n};
}

# Download the remote file into ./tmp and return the local path
sub fetch_file
{
    my $url = shift;
    my $ff  = File::Fetch->new(url => $url);
    my $fn  = $ff->fetch(to => './tmp') or die $ff->error;
    return $fn;
}

# Print the HTML form that asks for a URL to fetch
sub write_page
{
    print header;
    print "<html>\n";

    print <<"EOF";
<form action="" method="post">
<br>
File to Fetch: <input type="text" name="url" id="url">
<input type="submit" value="fetch">
</form>
</html>
EOF
}

You should, uh, test this before using it. Writing something in a comment box like this doesn't ensure the best quality :)
