Posts: 3,986
Threads: 178
Joined: Apr 2022
Reputation: 222
04-29-2022, 06:24 PM
(This post was last modified: 04-29-2022, 06:53 PM by bplus.)
(04-29-2022, 06:16 PM)Pete Wrote: I got rid of the fragmented unwanted auto-mention tags for you.
Pete
Ah much better! Thanks @Pete
Cleans up nice!
Come to think of it, I recall wanting to alert @SMcNeill because I was posting a mod of his code and hoping he might comment with a fix/test of his own, but he has been pretty busy lately.
That was the reason for one @; I have no idea where the other one came from. Wait... in another tab in my browser I was looking up Members for proper spelling (and in this case cAsE) and tried to copy/paste from there, and that's when all these funny-looking things started showing up... Plus all the code in the code box got underlined! Yikes almighty!
b = b + ...
Posts: 199
Threads: 15
Joined: Apr 2022
Reputation: 4
(04-29-2022, 05:33 PM)bplus Wrote: I hesitate to post this, but @SMcNeill had posted something a while ago about accessing the QB64 Wiki for a copy in case it went down. Here is my copy; you will at the very least need to update the address, but I found it helpful when I wanted to make a list of QB64 command words, both underlined and not (but no GL ones).
So this will help get the contents, and then you can mod it as you see fit?
Code:
Const SleepSet = 0 'to sleep between each keyword so we can see them and read them
Const SaveTo = "wiki.txt" '"SCRN:" to display to the screen, or insert a filename to save the list to disk
Screen _NewImage(1024, 720, 256)
Print "DOWNLOAD STARTING"
file$ = "QB64WikiList.txt"
'Increase the 2-second time limit here if you need to.
DownloadRSS "http://qb64.org/wiki/Keyword_Reference_-_Alphabetical", file$, 2
Print "DOWNLOAD FINISHED"
Print
Open file$ For Binary As #1
temp$ = Space$(LOF(1))
Get #1, , temp$
Close #1
l = 0
finish$ = "<div id=" + Chr$(34) + "symbols" + Chr$(34) + "></div>"
finish = InStr(temp$, finish$) 'No need to parse anything after we get down to the symbol section of the wiki page
Open SaveTo For Output As #1
Do
l = InStr(l, temp$, "<li>")
If l >= finish Then Exit Do
If l Then
l2 = InStr(l + 5, temp$, "</li>"): If l2 = 0 Then l2 = Len(temp$)
work$ = Mid$(temp$, l + 4, l2 - l - 5)
li1 = InStr(work$, Chr$(34)): li2 = InStr(li1 + 1, work$, Chr$(34))
link$ = Mid$(work$, li1 + 1, li2 - li1 - 1)
If InStr(link$, "&") Then link$ = "Page does not exist yet."
link$ = StripCRLF(link$)
k1 = InStr(li2 + 1, work$, ">"): k2 = InStr(k1 + 1, work$, "<")
keyword$ = Mid$(work$, k1 + 1, k2 - k1 - 1)
keyword$ = StripCRLF(keyword$)
d = InStr(k2 + 1, work$, "<span"): d1 = InStr(d + 1, work$, ">"): d2 = InStr(d1 + 1, work$, "</span")
desc$ = Mid$(work$, d1 + 1, d2 - d1 - 1)
desc$ = StripCRLF(desc$)
Print #1, keyword$
Print #1, " http://qb64.org"; link$
Print #1, " "; desc$
If SleepSet Then Sleep
l = l2 + 1
End If
Loop Until l = 0
Close
Sleep
System
Sub DownloadRSS (url$, file$, timelimit)
link$ = url$
Dim l As _Integer64, lod As _Integer64
url2$ = RTrim$(LTrim$(link$))
url4$ = RTrim$(LTrim$(link$))
If Left$(UCase$(url2$), 7) = "HTTP://" Then url4$ = Mid$(url2$, 8)
x = InStr(url4$, "/")
If x Then url2$ = Left$(url4$, x - 1)
NewsClient = _OpenClient("TCP/IP:80:" + url2$)
If NewsClient = 0 Then Exit Sub
e$ = Chr$(13) + Chr$(10) ' end of line characters
url3$ = Right$(url4$, Len(url4$) - x + 1)
x$ = "GET " + url3$ + " HTTP/1.1" + e$
x$ = x$ + "Host: " + url2$ + e$ + e$
Put #NewsClient, , x$
Open file$ For Output As #1: Close #1
Open file$ For Binary As #1
t! = Timer ' start time
head$ = ""
cont_type$ = ""
Do
_Limit 20
Get #NewsClient, , a$
If LTrim$(a$) > "" Then Put #1, , a$
Loop Until Timer > t! + timelimit And timelimit > 0 ' (in seconds)
Close #NewsClient
Close #1
End Sub
Function StripCRLF$ (text$)
'The wiki seems to contain stray CRLF characters at the dangest spots.
'Why it has them, I don't know, but we need to filter them out so our information will load and display properly.
li1 = 0
Do
li1 = InStr(li1 + 1, text$, Chr$(13) + Chr$(10))
If li1 Then
l$ = Left$(text$, li1 - 1)
r$ = Mid$(text$, li1 + 2)
text$ = l$ + r$
End If
Loop Until li1 = 0
'Also some of the descriptions and such contain links to different keywords.
'We want to just strip those links and use a normal word replacement for ease of display, since we're not going to be displaying in
'an html editor/viewer.
li1 = 0
Do
li1 = InStr(li1 + 1, text$, "<a href")
If li1 Then
li2 = InStr(li1 + 1, text$, "</a>")
li3 = InStr(li1 + 1, text$, ">")
l$ = Left$(text$, li1 - 1)
m$ = Mid$(text$, li3 + 1, li2 - li3 - 1)
r$ = Mid$(text$, li2 + 4)
text$ = l$ + m$ + r$
End If
Loop Until li1 = 0
StripCRLF$ = text$
End Function
I have an execution error:
line 62, illegal function call
NewsClient = _OpenClient("TCP/IP:80:" + url2$)
this code must be meant for Windows...
thanks anyway
Posts: 2,186
Threads: 222
Joined: Apr 2022
Reputation: 104
@bplus That settles it. I'm never shaving again!
@Coolman bplus used to use Windows. He's using Linux now. What are you using? If Mac, I'm not sure anyone here is actively using a Mac right now.
Pete
Shoot first and shoot people who ask questions, later.
Posts: 3,986
Threads: 178
Joined: Apr 2022
Reputation: 222
04-29-2022, 07:34 PM
(This post was last modified: 04-29-2022, 07:39 PM by bplus.)
(04-29-2022, 06:54 PM)Coolman Wrote:
I have an execution error:
line 62, illegal function call
NewsClient = _OpenClient("TCP/IP:80:" + url2$)
this code must be meant for Windows...
thanks anyway
Hi @Coolman,
As I had warned, you will at the very least need the new URL (Internet link) for this:
"http://qb64.org/wiki/Keyword_Reference_-_Alphabetical"
Remember, this has changed since the old QB64 Wiki was locked. Check the Wiki board here for the new Wiki link, and you might want to start at the Home page instead of the Keyword_Reference page. This is why I was trying to alert Steve.
Hey, you can probably get the correct URL by browsing to the new QB64pe Wiki and copy/pasting that?! (A one-line sketch of the change follows the links below.)
Ya! This takes you to the main page:
https://qb64phoenix.com/qb64wiki/index.php/Main_Page
This takes you to the Keywords page:
https://qb64phoenix.com/qb64wiki/index.p...phabetical
Here is the link to Keywords by usage, very helpful if you don't know the word but do know what function it provides:
https://qb64phoenix.com/qb64wiki/index.p...-_By_usage
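For example, the only change needed is in the call at the top of the listing. A minimal, untested sketch (the page path here is an assumption; I kept the plain http:// form because the DownloadRSS Sub opens an ordinary TCP connection on port 80):
Code:
' Hypothetical one-line change; the rest of the listing stays the same.
DownloadRSS "http://qb64phoenix.com/qb64wiki/index.php/Keyword_Reference_-_Alphabetical", file$, 2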
b = b + ...
Posts: 199
Threads: 15
Joined: Apr 2022
Reputation: 4
04-29-2022, 09:59 PM
(This post was last modified: 04-29-2022, 10:01 PM by Coolman.)
To be clear, I use Linux (Ubuntu)...
This is what I had done; does this code work for you under Linux?
Code:
Const SleepSet = 0 'to sleep between each keyword so we can see them and read them
Const SaveTo = "wiki.txt" '"SCRN:" to display to the screen, or insert a filename to save the list to disk
Screen _NewImage(1024, 720, 256)
Print "DOWNLOAD STARTING"
file$ = "QB64WikiList.txt"
'Increase the 2-second time limit here if you need to.
DownloadRSS "https://qb64phoenix.com/qb64wiki/index.php/Keyword_Reference_-_Alphabetical", file$, 2
Print "DOWNLOAD FINISHED"
Print
Open file$ For Binary As #1
temp$ = Space$(LOF(1))
Get #1, , temp$
Close #1
l = 0
finish$ = "<div id=" + Chr$(34) + "symbols" + Chr$(34) + "></div>"
finish = InStr(temp$, finish$) 'No need to parse anything after we get down to the symbol section of the wiki page
Open SaveTo For Output As #1
Do
l = InStr(l, temp$, "<li>")
If l >= finish Then Exit Do
If l Then
l2 = InStr(l + 5, temp$, "</li>"): If l2 = 0 Then l2 = Len(temp$)
work$ = Mid$(temp$, l + 4, l2 - l - 5)
li1 = InStr(work$, Chr$(34)): li2 = InStr(li1 + 1, work$, Chr$(34))
link$ = Mid$(work$, li1 + 1, li2 - li1 - 1)
If InStr(link$, "&") Then link$ = "Page does not exist yet."
link$ = StripCRLF(link$)
k1 = InStr(li2 + 1, work$, ">"): k2 = InStr(k1 + 1, work$, "<")
keyword$ = Mid$(work$, k1 + 1, k2 - k1 - 1)
keyword$ = StripCRLF(keyword$)
d = InStr(k2 + 1, work$, "<span"): d1 = InStr(d + 1, work$, ">"): d2 = InStr(d1 + 1, work$, "</span")
desc$ = Mid$(work$, d1 + 1, d2 - d1 - 1)
desc$ = StripCRLF(desc$)
Print #1, keyword$
Print #1, " http://qb64.org"; link$
Print #1, " "; desc$
If SleepSet Then Sleep
l = l2 + 1
End If
Loop Until l = 0
Close
Sleep
System
Sub DownloadRSS (url$, file$, timelimit)
link$ = url$
Dim l As _Integer64, lod As _Integer64
url2$ = RTrim$(LTrim$(link$))
url4$ = RTrim$(LTrim$(link$))
If Left$(UCase$(url2$), 7) = "HTTP://" Then url4$ = Mid$(url2$, 8)
x = InStr(url4$, "/")
If x Then url2$ = Left$(url4$, x - 1)
NewsClient = _OpenClient("TCP/IP:80:" + url2$)
If NewsClient = 0 Then Exit Sub
e$ = Chr$(13) + Chr$(10) ' end of line characters
url3$ = Right$(url4$, Len(url4$) - x + 1)
x$ = "GET " + url3$ + " HTTP/1.1" + e$
x$ = x$ + "Host: " + url2$ + e$ + e$
Put #NewsClient, , x$
Open file$ For Output As #1: Close #1
Open file$ For Binary As #1
t! = Timer ' start time
head$ = ""
cont_type$ = ""
Do
_Limit 20
Get #NewsClient, , a$
If LTrim$(a$) > "" Then Put #1, , a$
Loop Until Timer > t! + timelimit And timelimit > 0 ' (in seconds)
Close #NewsClient
Close #1
End Sub
Function StripCRLF$ (text$)
'The wiki seems to contain stray CRLF characters at the dangest spots.
'Why it has them, I don't know, but we need to filter them out so our information will load and display properly.
li1 = 0
Do
li1 = InStr(li1 + 1, text$, Chr$(13) + Chr$(10))
If li1 Then
l$ = Left$(text$, li1 - 1)
r$ = Mid$(text$, li1 + 2)
text$ = l$ + r$
End If
Loop Until li1 = 0
'Also some of the descriptions and such contain links to different keywords.
'We want to just strip those links and use a normal word replacement for ease of display, since we're not going to be displaying in
'an html editor/viewer.
li1 = 0
Do
li1 = InStr(li1 + 1, text$, "<a href")
If li1 Then
li2 = InStr(li1 + 1, text$, "</a>")
li3 = InStr(li1 + 1, text$, ">")
l$ = Left$(text$, li1 - 1)
m$ = Mid$(text$, li3 + 1, li2 - li3 - 1)
r$ = Mid$(text$, li2 + 4)
text$ = l$ + m$ + r$
End If
Loop Until li1 = 0
StripCRLF$ = text$
End Function
Posts: 2,698
Threads: 328
Joined: Apr 2022
Reputation: 218
Try the http link, @Coolman, rather than the https. You'll need to use curl, or such, to download via https.
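For instance, a minimal, untested sketch of the curl route (assumes curl is installed on the system; the URL, filename, and flags are only illustrative):
Code:
' Let curl handle the https download, then read the saved file as before.
' -s = silent, -L = follow redirects, -o = write output to the given file.
url$ = "https://qb64phoenix.com/qb64wiki/index.php/Keyword_Reference_-_Alphabetical"
file$ = "QB64WikiList.txt"
Shell _Hide "curl -s -L -o " + Chr$(34) + file$ + Chr$(34) + " " + Chr$(34) + url$ + Chr$(34)
Open file$ For Binary As #1
temp$ = Space$(LOF(1))
Get #1, , temp$
Close #1
Print "Downloaded"; Len(temp$); "bytes"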
Posts: 199
Threads: 15
Joined: Apr 2022
Reputation: 4
Thank you. Yes, that was the problem; it works now. In the end, I used the Print Edit WE extension for Firefox to generate PDF files...
Posts: 199
Threads: 15
Joined: Apr 2022
Reputation: 4
Hello. Does anyone have a copy of the old QB64 wiki that they could post as a compressed archive? I could convert the files to PDF. I already did that with the copy of the old forum; I got about 1.5 GB of data in which I can do keyword searches with the excellent file browser Dolphin (KDE)...
Posts: 3,986
Threads: 178
Joined: Apr 2022
Reputation: 222
04-30-2022, 05:59 PM
(This post was last modified: 04-30-2022, 06:00 PM by bplus.)
(04-30-2022, 12:02 PM)Coolman Wrote: Does anyone have a copy of the old QB64 wiki that they could post as a compressed archive? I could convert the files to PDF...
I have HTML files of the Keywords (no GL's) from the code I showed you earlier in this thread.
It is 145 MB, 7264 files, dated 2/24/2018, which I am pretty sure I can pass along as a zip.
b = b + ...
Posts: 597
Threads: 110
Joined: Apr 2022
Reputation: 34
(04-29-2022, 06:24 PM)bplus Wrote: (04-29-2022, 06:16 PM)Pete Wrote: I got rid of the fragmented unwanted auto-mention tags for you.
Pete
Ah much better! Thanks @Pete
Cleans up nice!
Baby face
You've got the cutest little
Baby face
There's not another who could take your place
Baby face
(Yeah, I don't relate to the rest of the lyrics...)