How do I download a Wikispaces page?
I usually download web pages with HTTrack when I know that I'll need a page while I have no internet connection. That's what I wanted to do with a Wikispaces page too.
However, HTTrack just can't download it:
Warning: File has moved from elte-ik-linalg.wikispaces.com/ to
https://session.wikispaces.com/1/auth/auth?authToken=0c4659f405444bb5ffd1e1284246a1a87
Info: No data seems to have been transferred during this session! : restoring previous one!
How do I download a Wikispaces page, then?
Tags: download, webpage, httrack
You could try wget; it has a recursive function. – sinni800, Jan 19 '12 at 15:10
@sinni800 - It gives an error and cannot download the page. If anyone manages to configure wget properly, go ahead; I'll accept that as an answer too, of course. – Shiki, Jan 19 '12 at 15:32
What's your depth on HTTrack? You can also add a rule to exclude session.wikispaces.com, and it will avoid that domain regardless of depth. – skub, Jan 19 '12 at 15:40
@Shiki Huh? That's strange. It should at least download the page itself without the recursive switch, just with a plain wget url. – sinni800, Jan 31 '12 at 7:41
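Putting the two suggestions above together, a wget invocation along these lines is worth trying. This is only a sketch: `sitename.wikispaces.com` stands in for the actual wiki, and the assumption is that excluding the `session.wikispaces.com` domain keeps wget away from the login redirect that tripped up HTTrack.

```shell
# Mirror the wiki recursively, rewriting links for offline viewing.
# --exclude-domains keeps wget off the session.wikispaces.com login
# redirect; --page-requisites pulls in images and stylesheets.
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --domains=sitename.wikispaces.com \
     --exclude-domains=session.wikispaces.com \
     http://sitename.wikispaces.com/
```

If the server still bounces everything to the login page, credentials have to come into play, as the answers below describe.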
asked Jan 19 '12 at 14:19 by Shiki; edited Sep 1 '13 at 19:16 by nc4pk
2 Answers
Try this advice (from BeckyG at the HTTrack forum):

If you log into your WikiSpaces wiki, click Manage Wiki on the upper left-hand side of your WikiSpace. If you are an "organizer" of your wiki, you should have an Export option if you scroll down. If you have the Export option, I think you can save the whole wiki as a ZIP file. However, if you are not an organizer/owner of that wiki, you can still download the contents by clicking the Web Folders (WebDAV) link. It should show a web address something like this: YourWikiName.wikispaces.com/space/dav, and the last two words in that link are very important. Using WinHTTrack Website Copier, start a new project. In the Add URL option, paste in your whole WikiSpaces address with /space/dav at the end. Type your WikiSpaces user name in the Login field (NOT the email address associated with the wiki; only your user name will work) and type your WikiSpaces password in the Password field. Click Next and Finish. Your WikiSpaces site should begin downloading. It won't be "pretty" with images, but it will provide all the files and documents (the text content) of the wiki.
I just tried this method myself. It took a moment or two, but it worked and was straightforward.

answered Jan 19 '12 at 15:47 by kmote
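For those using the httrack command-line tool instead of the WinHTTrack GUI, the same WebDAV approach might look roughly like this. This is a sketch, not a verified recipe: `MyName`, `MyPass`, and `YourWikiName` are placeholders, and the `user:pass@` URL form is HTTrack's usual way of passing HTTP credentials.

```shell
# Mirror the wiki's WebDAV tree into ./wiki-mirror, authenticating
# with the WikiSpaces user name (NOT the email address).
httrack "http://MyName:MyPass@YourWikiName.wikispaces.com/space/dav" \
        -O ./wiki-mirror
```

As the comments below note, the WebDAV mirror can come out as a raw directory dump rather than a browsable site, so treat it as a content backup rather than an offline copy of the pages.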
I'll try to ask an "organizer" to send me the wiki. I have tried the WebDAV one; I registered an account to try it out. After entering the URL and the user/pass combo, HTTrack started downloading. However, the log is full of errors (404), and in the end the content is garbled: there are tons of directories and I just can't find anything. (The character encoding is also wrong.) It's not usable, sadly. – Shiki, Jan 19 '12 at 16:00
So you do not see the Export option? (It would be at the very bottom of the list if you have it.) – kmote, Jan 19 '12 at 21:03
I tried the WebDAV option; since I'm just a member, I registered just for this, to download the wiki. – Shiki, Jan 19 '12 at 22:42
I have another method, which I just posted on the HTTrack forum. I'll restate it here:

1. Let's say you want to mirror http://sitename.wikispaces.com
2. Create an account on Wikispaces. Let's assume the username is "MyName" and the password is "MyPass".
3. VERY IMPORTANT: you MUST log in with that account!
4. Add a URL with the username and password. You should get something like this: http://MyName:MyPass@sitename.wikispaces.com/
5. Add another URL without the username and password, e.g., http://sitename.wikispaces.com/
6. In the Rules for the project, add +session.wikispaces.com/* on the bottom line (or the second line from the bottom if you use --disable-security-limits). Also exclude */space/* to prevent HTTrack from downloading auxiliary information (and the WebDAV files).
7. Start the mirroring process. After approximately 5-15 minutes, stop it and peruse the (partial) mirror on the hard disk. Add any sections you don't want to mirror to the Rules as exclusions, then restart mirroring.

Hope this helps.

Edit: The first time you start mirroring in Step 7, you might encounter the same error you experienced now. Just restart the mirroring, and all should go well.

answered Feb 21 '12 at 6:56 by pepoluan
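As a concrete illustration of the Rules step above, the project's scan-rules box would contain filter lines along these lines (HTTrack's `+`/`-` filter syntax; `sitename` is a placeholder):

```
+session.wikispaces.com/*
-*/space/*
```

The `+` line lets HTTrack follow the authentication hop through session.wikispaces.com instead of aborting, while the `-` line keeps the WebDAV tree out of the mirror.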