• [LINK] Google begins requiring JavaScript for Google Search

    From Computer Nerd Kev@21:1/5 to All on Tue Jan 21 07:10:00 2025
    Google begins requiring JavaScript for Google Search
    by Thom Holwerda 2025-01-18
    - https://www.osnews.com/story/141570/google-begins-requiring-javascript-for-google-search/

    " Google says it has begun requiring users to turn on JavaScript, the
    widely used programming language to make web pages interactive, in
    order to use Google Search.
    In an email to TechCrunch, a company spokesperson claimed that the
    change is intended to "better protect" Google Search against
    malicious activity, such as bots and spam, and to improve the
    overall Google Search experience for users. The spokesperson noted
    that, without JavaScript, many Google Search features won't work
    properly and that the quality of search results tends to be
    degraded.
    Kyle Wiggers at TechCrunch

    One of the strangely odd compliments you could give Google Search
    is that it would load even on the weirdest or oldest browsers,
    simply because it didn't require JavaScript. Whether I loaded
    Google Search in the JS-less Dillo, Blazer on PalmOS, or the latest
    Firefox, I'd end up with a search box I could type something into
    and search. Sure, beyond that the web would be, shall we say,
    problematic, but at least Google Search worked. With this move,
    Google will end such compatibility, which was most likely a side
    effect more than policy." ...

    I switched from Google to Duck Duck Go (Lite) many years ago, but
    it's annoying that I'll have to find another search engine to use
    as a fall-back for when DDG breaks, since I do most of my Web
    browsing in Dillo.

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From D@21:1/5 to Computer Nerd Kev on Tue Jan 21 10:23:02 2025
    On Mon, 21 Jan 2025, Computer Nerd Kev wrote:

    > [...]
    >
    > I switched from Google to Duck Duck Go (Lite) many years ago, but
    > it's annoying that I'll have to find another search engine to use
    > as a fall-back for when DDG breaks, since I do most of my Web
    > browsing in Dillo.

    Can't you use the !g on ddg? Maybe ddg sanitizes the google output a bit?
    If that doesn't work, I recommend startpage.com which is just an interface
    to google.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jan van den Broek@21:1/5 to Sylvia Else on Tue Jan 21 12:56:41 2025
    2025-01-21, Sylvia Else <sylvia@email.invalid> wrote:

    [Snip]

    > How is this going to '"better protect" Google Search against
    > malicious activity, such as bots and spam'?
    >
    > Sylvia.

    Simple, it won't, but it sounds nice.

    --
    Jan van den Broek
    balglaas@dds.nl 0xAFDAD00D
    http://huizen.dds.nl/~balglaas/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Sylvia Else@21:1/5 to Computer Nerd Kev on Tue Jan 21 20:18:45 2025
    On 21-Jan-25 5:10 am, Computer Nerd Kev wrote:
    > [...]
    >
    > In an email to TechCrunch, a company spokesperson claimed that the
    > change is intended to "better protect" Google Search against
    > malicious activity, such as bots and spam, and to improve the
    > overall Google Search experience for users.


    How is this going to '"better protect" Google Search against malicious activity, such as bots and spam'?

    Sylvia.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Salvador Mirzo@21:1/5 to Sylvia Else on Tue Jan 21 10:33:26 2025
    Sylvia Else <sylvia@email.invalid> writes:

    > On 21-Jan-25 5:10 am, Computer Nerd Kev wrote:
    > [...]
    >
    > How is this going to '"better protect" Google Search against
    > malicious activity, such as bots and spam'?

    I believe the idea is that if the robot doesn't speak Javascript, it's
    an easy denial by the web server. And making bots speak Javascript is
    one step up. And with Javascript they can likely monitor things like
    mouse movement to detect whether the user is a human or a robot.

    I'm not approving the idea; just sharing what I think they might have in
    mind when they say Javascript will help them fend off robots.
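The detection scheme Salvador is guessing at can be sketched as toy code. Everything below (the function name, the "identical steps" rule, the sample traces) is invented purely for illustration; a real system such as reCAPTCHA uses far richer signals than this.

```python
# Toy sketch of server-side "mouse movement" bot scoring. All names and
# thresholds here are invented for illustration only.

def looks_like_bot(trace):
    """Score a list of (x, y) pointer samples.

    Humans produce jittery, curved paths; a naive bot either sends no
    movement at all or moves in perfectly even, machine-generated steps.
    """
    if len(trace) < 3:
        return True  # no meaningful movement recorded
    # Step vectors between consecutive samples.
    steps = [(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(trace, trace[1:])]
    # Identical steps throughout => perfectly linear, evenly timed motion.
    return all(s == steps[0] for s in steps)

human = [(0, 0), (3, 1), (5, 4), (9, 5), (12, 9)]   # irregular path
bot   = [(0, 0), (2, 2), (4, 4), (6, 6), (8, 8)]    # exact straight line
print(looks_like_bot(human))  # False
print(looks_like_bot(bot))    # True
```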

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jim Jackson@21:1/5 to Computer Nerd Kev on Tue Jan 21 15:30:19 2025
    Oh the irony ... it's ok to scrape everybody else's content to train its AI/News products, but how dare anyone else try the same to us!

    On 2025-01-20, Computer Nerd Kev <not@telling.you.invalid> wrote:
    > [...]


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From D@21:1/5 to Jim Jackson on Tue Jan 21 16:26:10 2025
    On Tue, 21 Jan 2025 15:30:19 -0000 (UTC), Jim Jackson <jj@franjam.org.uk> wrote:
    > Oh the irony ... it's ok to scrape everybody else's content to train
    > its AI/News products, but how dare anyone else try the same to us!

    the supreme council have wanted their infallible "nanny state" to become
    an actual planet-wide reality . . . and by now they've just about got it

    if people really do get the government they deserve, then the government
    really do get the people they deserve . . what goes around, comes around

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Rich@21:1/5 to Salvador Mirzo on Tue Jan 21 18:12:08 2025
    Salvador Mirzo <smirzo@example.com> wrote:
    > Sylvia Else <sylvia@email.invalid> writes:
    >
    > [...]
    >
    >> How is this going to '"better protect" Google Search against
    >> malicious activity, such as bots and spam'?
    >
    > I believe the idea is that if the robot doesn't speak Javascript,
    > it's an easy denial by the web server. And making bots speak
    > Javascript is one step up. And with Javascript they can likely
    > monitor things like mouse movement to detect whether the user is a
    > human or a robot.
    >
    > I'm not approving the idea; just sharing what I think they might have
    > in mind when they say Javascript will help them fend off robots.

    Yes, this is probably the 'excuse' they would offer up if pressed (for
    bots -- for SPAM, no idea).

    But the part they forget is that the reason they have such a 'bots'
    problem is the revenue the bot authors can obtain by gaming google
    search. All requiring JS will do is result in those same scammers
    "building a better bot" -- i.e., the revenue stream is enough they will
    put in the effort to make their bots speak JS, and google will be back
    where they started.

    The *real* reason, which they will likely never admit to, is likely
    that the advertising overlords in control of what is left of the old
    "don't be evil" google figured out they can gain more "data" on users
    by requiring JS than not, and so the change is solely to hoover up more
    data and gain more ad dollars for the mothership.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From D@21:1/5 to Jim Jackson on Tue Jan 21 19:21:34 2025
    On Tue, 21 Jan 2025, Jim Jackson wrote:


    > Oh the irony ... it's ok to scrape everybody else's content to train
    > its AI/News products, but how dare anyone else try the same to us!

    Jim, you're forgetting something. It is also OK to serve up copyrighted
    material such as movies and TV series to the public on YouTube without
    the owner's permission, and not get any fines at all!

    Woe unto you if you do that as a small business owner; then the IP
    lawyers will be after you.


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From D@21:1/5 to Jan van den Broek on Tue Jan 21 19:18:55 2025
    On Tue, 21 Jan 2025, Jan van den Broek wrote:

    > 2025-01-21, Sylvia Else <sylvia@email.invalid> wrote:
    >
    > [Snip]
    >
    >> How is this going to '"better protect" Google Search against malicious
    >> activity, such as bots and spam'?
    >>
    >> Sylvia.
    >
    > Simple, it won't, but it sounds nice.

    This is the truth!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andy Burns@21:1/5 to Computer Nerd Kev on Tue Jan 21 19:54:15 2025
    Computer Nerd Kev wrote:

    > " Google says it has begun requiring users to turn on JavaScript, the
    > widely used programming language to make web pages interactive, in
    > order to use Google Search.

    Text search still works with JS disabled, but I think image/map/video
    search have required JS for some time. Shopping search is pot luck, as
    you only see descriptions with no images, and you can't use verbatim or
    date-range searches unless you know how to manipulate the query
    parameters in the URL.
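For reference, the query parameters Andy alludes to can be set by hand. The `tbs` values below (`li:1` for verbatim, `cdr`/`cd_min`/`cd_max` for a custom date range) are only informally documented and could change at any time; this is a sketch, not a stable API.

```python
# Build a Google Search URL with hand-set query parameters.
from urllib.parse import urlencode

def google_url(query, verbatim=False, date_min=None, date_max=None):
    params = {"q": query}
    tbs = []
    if verbatim:
        tbs.append("li:1")  # informally documented "verbatim" flag
    if date_min and date_max:
        # custom date range, dates in M/D/YYYY form
        tbs.append(f"cdr:1,cd_min:{date_min},cd_max:{date_max}")
    if tbs:
        params["tbs"] = ",".join(tbs)
    return "https://www.google.com/search?" + urlencode(params)

print(google_url("dillo browser", verbatim=True))
# https://www.google.com/search?q=dillo+browser&tbs=li%3A1
```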

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Computer Nerd Kev@21:1/5 to Andy Burns on Wed Jan 22 06:47:31 2025
    Andy Burns <usenet@andyburns.uk> wrote:
    > Computer Nerd Kev wrote:
    >
    >> " Google says it has begun requiring users to turn on JavaScript, the
    >> widely used programming language to make web pages interactive, in
    >> order to use Google Search.
    >
    > Text search still works with JS disabled,

    Before posting I tried a Google search in Dillo and was redirected
    to a page saying:
    "Turn on JavaScript to keep searching"

    Same thing today.

    Apparently some specific user-agents might be excepted, although
    that seems inconsistent with aiming to block bots, since it's an
    obvious solution for them too.

    > but I think image/map/video search have required JS for some
    > time

    Video and image searches didn't need JS and are actually still
    working in Dillo even though full web searches are denied. My guess
    is that they'll roll the redirects out to them as well before long
    though.

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jerry Peters@21:1/5 to Computer Nerd Kev on Tue Jan 21 20:57:42 2025
    Computer Nerd Kev <not@telling.you.invalid> wrote:
    Google begins requiring JavaScript for Google Search
    by Thom Holwerda 2025-01-18
    - https://www.osnews.com/story/141570/google-begins-requiring-javascript-for-google-search/

    " Google says it has begun requiring users to turn on JavaScript, the
    widely used programming language to make web pages interactive, in
    order to use Google Search.
    In an email to TechCrunch, a company spokesperson claimed that the
    change is intended to "better protect" Google Search against
    malicious activity, such as bots and spam, and to improve the
    overall Google Search experience for users. The spokesperson noted
    that, without JavaScript, many Google Search features won't work
    properly and that the quality of search results tends to be
    degraded.
    Kyle Wiggers at TechCrunch

    I don't want many Google Search features, I find them annoying,
    especially their "suggestions".
    As for the "quality of search results", I mostly use duckduckgo
    because google search results generally suck.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Computer Nerd Kev@21:1/5 to Salvador Mirzo on Wed Jan 22 06:56:48 2025
    Salvador Mirzo <smirzo@example.com> wrote:
    Sylvia Else <sylvia@email.invalid> writes:
    How is this going to '"better protect" Google Search against malicious
    activity, such as bots and spam'?

    I believe the idea is that if the robot doesn't speak Javascript, it's
    an easy denial by the web server. And making bots speak Javascript is
    one step up. And with Javascript they can likely monitor things like
    mouse movement to detect whether the user is a human or a robot.

    Which of course is one of Google's main businesses, with their
    Captchas that don't always need to show a puzzle in order to
    validate users as humans. So if anyone _thinks_ they can achieve
    that, you'd expect it to be Google.

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From yeti@21:1/5 to Computer Nerd Kev on Tue Jan 21 22:49:26 2025
    not@telling.you.invalid (Computer Nerd Kev) wrote:

    > Before posting I tried a Google search in Dillo and was redirected
    > to a page saying:
    > "Turn on JavaScript to keep searching"
    >
    > Same thing today.

    ~$ grep ELinks .dillo/dillorc
    ## "ELinks/0.18.0 (textmode; Linux 5.10.0-33-amd64 x86_64; 102x36-2)"
    http_user_agent="ELinks/0.18.0 (textmode; Linux)"

    --
    I do not bite, I just want to play.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Computer Nerd Kev@21:1/5 to yeti on Wed Jan 22 13:04:02 2025
    yeti <yeti@tilde.institute> wrote:
    > not@telling.you.invalid (Computer Nerd Kev) wrote:
    >
    >> Before posting I tried a Google search in Dillo and was redirected
    >> to a page saying:
    >> "Turn on JavaScript to keep searching"
    >>
    >> Same thing today.
    >
    > ~$ grep ELinks .dillo/dillorc
    > ## "ELinks/0.18.0 (textmode; Linux 5.10.0-33-amd64 x86_64; 102x36-2)"
    > http_user_agent="ELinks/0.18.0 (textmode; Linux)"

    But now nobody knows you're using Dillo in the first place! What
    incentive do website makers have to consider Dillo users if they're
    all pretending to use other browsers?

    I think it's shooting yourself in the foot. True, websites have
    been machine gunning me down for years in spite of my honest Dillo
    user-agent, but at least I'm not part of the problem.

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From yeti@21:1/5 to Computer Nerd Kev on Wed Jan 22 12:58:19 2025
    Computer Nerd Kev <not@telling.you.invalid> wrote:

    > yeti <yeti@tilde.institute> wrote:
    >> not@telling.you.invalid (Computer Nerd Kev) wrote:
    >>
    >>> Before posting I tried a Google search in Dillo and was redirected
    >>> to a page saying:
    >>> "Turn on JavaScript to keep searching"
    >>>
    >>> Same thing today.
    >>
    >> ~$ grep ELinks .dillo/dillorc
    >> ## "ELinks/0.18.0 (textmode; Linux 5.10.0-33-amd64 x86_64; 102x36-2)"
    >> http_user_agent="ELinks/0.18.0 (textmode; Linux)"
    >
    > But now nobody knows you're using Dillo in the first place! What
    > incentive do website makers have to consider Dillo users if they're
    > all pretending to use other browsers?

    I played with that config setting for maybe only an hour. Just for
    curiosity.

    --
    I do not bite, I just want to play.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ivan Shmakov@21:1/5 to All on Thu Jan 23 19:33:29 2025
    On 2025-01-21, Computer Nerd Kev wrote:
    > Salvador Mirzo <smirzo@example.com> wrote:
    >> Sylvia Else <sylvia@email.invalid> writes:
    >>
    >>> How is this going to '"better protect" Google Search against
    >>> malicious activity, such as bots and spam'?
    >>
    >> I believe the idea is that if the robot doesn't speak Javascript,
    >> it's an easy denial by the web server. And making bots speak
    >> Javascript is one step up. And with Javascript they can likely
    >> monitor things like mouse movement to detect whether the user
    >> is a human or a robot.
    >
    > Which of course is one of Google's main businesses, with their
    > Captchas that don't always need to show a puzzle in order to
    > validate users as humans. So if anyone _thinks_ they can achieve
    > that, you'd expect it to be Google.

    And they don't even need it to be perfect: a robot that
    implements the relevant browser APIs, while possible, /will/
    be costlier to run and maintain, thus reducing the profits of
    the robot operators, in turn disincentivizing them.

    Even if that doesn't solve the problem altogether, it will
    still likely result in less load for their servers.

    Not that it invalidates any other reasons they might want to
    require Javascript / APIs regardless, mind you.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Sylvia Else@21:1/5 to Ivan Shmakov on Fri Jan 24 13:30:17 2025
    On 24-Jan-25 3:33 am, Ivan Shmakov wrote:
    > [...]
    >
    > And they don't even need it to be perfect: a robot that
    > implements the relevant browser APIs, while possible, /will/
    > be costlier to run and maintain, thus reducing the profits of
    > the robot operators, in turn disincentivizing them.
    >
    > Even if that doesn't solve the problem altogether, it will
    > still likely result in less load for their servers.
    >
    > Not that it invalidates any other reasons they might want to
    > require Javascript / APIs regardless, mind you.

    A bot only needs to be able to send the correct data to the server. How
    difficult that is obviously depends on the details of the Javascript's
    interactions with the server, but frequent interactions themselves
    create a higher server load.

    One example would be the mouse-movement based human detection. If the
    script just sends a yes/no message to the server, then the bot doesn't
    need to try to emulate a human at all.

    Sylvia.
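Sylvia's replay point can be sketched concretely. The endpoint-less payload and the field name "human" below are invented for illustration; the point is only that a bot can forge whatever final message the page's script would have sent, without ever running the script.

```python
# Sketch of the replay attack Sylvia describes: if the page's script
# distils its mouse analysis into a single flag sent to the server, a bot
# skips the analysis and sends the flag directly. Field name is invented.
from urllib.parse import urlencode

def human_check_payload(is_human):
    # What the page's JS might POST after watching the mouse. A real
    # system would sign or obfuscate this, which is exactly what a bot
    # author would then reverse-engineer and forge.
    return urlencode({"human": "yes" if is_human else "no"})

# The bot never runs the analysis; it just replays the success message:
print(human_check_payload(True))  # human=yes
```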

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Scott Dorsey@21:1/5 to not@telling.you.invalid on Sat Jan 25 15:15:44 2025
    Computer Nerd Kev <not@telling.you.invalid> wrote:
    > yeti <yeti@tilde.institute> wrote:
    >> not@telling.you.invalid (Computer Nerd Kev) wrote:
    >>
    >>> Before posting I tried a Google search in Dillo and was redirected
    >>> to a page saying:
    >>> "Turn on JavaScript to keep searching"
    >>>
    >>> Same thing today.
    >>
    >> ~$ grep ELinks .dillo/dillorc
    >> ## "ELinks/0.18.0 (textmode; Linux 5.10.0-33-amd64 x86_64; 102x36-2)"
    >> http_user_agent="ELinks/0.18.0 (textmode; Linux)"
    >
    > But now nobody knows you're using Dillo in the first place! What
    > incentive do website makers have to consider Dillo users if they're
    > all pretending to use other browsers?

    I am using lynx, and it is configured to identify as "Laxative Nine" in
    the string, and I have no problem doing google searches. It has not at
    any point got upset at my lack of javascript, unlike so many other
    sites.
    --scott

    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richmond@21:1/5 to Sylvia Else on Mon Jan 27 20:56:15 2025
    Sylvia Else <sylvia@email.invalid> writes:

    > On 24-Jan-25 3:33 am, Ivan Shmakov wrote:
    > [...]
    >
    > One example would be the mouse-movement based human detection. If the
    > script just sends a yes/no message to the server, then the bot doesn't
    > need to try to emulate a human at all.
    >
    > Sylvia.

    That's useful. I set my Seamonkey user agent string to a Lynx user agent
    string and now google search works without javascript.
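The user-agent experiment Richmond and yeti describe can be reproduced outside a browser. The Lynx version string below is just an example, and which user-agents Google exempts is undocumented and liable to change; this only shows how to attach the header.

```python
# Build a request that identifies as a text-mode browser. Whether the
# JavaScript wall appears can only be checked against the live server.
import urllib.request

LYNX_UA = "Lynx/2.9.0 libwww-FM/2.14"  # example string; any Lynx-like UA

def make_request(url, user_agent=LYNX_UA):
    # urllib normalizes header names, storing this one as "User-agent".
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

req = make_request("https://www.google.com/search?q=dillo")
print(req.get_header("User-agent"))  # Lynx/2.9.0 libwww-FM/2.14
# urllib.request.urlopen(req) would then fetch the page with that UA.
```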

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richmond@21:1/5 to Richmond on Mon Jan 27 21:02:54 2025
    Richmond <dnomhcir@gmx.com> writes:

    > [...]
    >
    > That's useful. I set my Seamonkey user agent string to a Lynx user
    > agent string and now google search works without javascript.

    Sorry, I replied to the wrong article.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From candycanearter07@21:1/5 to Salvador Mirzo on Wed Jan 29 19:00:04 2025
    Salvador Mirzo <smirzo@example.com> wrote at 13:33 this Tuesday (GMT):
    > Sylvia Else <sylvia@email.invalid> writes:
    >
    > [...]
    >
    >> How is this going to '"better protect" Google Search against
    >> malicious activity, such as bots and spam'?
    >
    > I believe the idea is that if the robot doesn't speak Javascript, it's
    > an easy denial by the web server. And making bots speak Javascript is
    > one step up. And with Javascript they can likely monitor things like
    > mouse movement to detect whether the user is a human or a robot.
    >
    > I'm not approving the idea; just sharing what I think they might have
    > in mind when they say Javascript will help them fend off robots.


    It would also make it harder to scrape, since I /think/ web scrapers
    don't run JS by default.
    --
    user <candycane> is generated from /dev/urandom

  • From Richmond@21:1/5 to Rich on Wed Jan 29 20:04:47 2025
    Rich <rich@example.invalid> writes:

    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:
    Salvador Mirzo <smirzo@example.com> wrote at 13:33 this Tuesday (GMT):
    Sylvia Else <sylvia@email.invalid> writes:

    On 21-Jan-25 5:10 am, Computer Nerd Kev wrote:
    Google begins requiring JavaScript for Google Search by Thom
    Holwerda 2025-01-18 -
    https://www.osnews.com/story/141570/google-begins-requiring-javascript-for-google-search/
    " Google says it has begun requiring users to turn on
    JavaScript, the widely used programming language to make web
    pages interactive, in order to use Google Search. In an email
    to TechCrunch, a company spokesperson claimed that the change is
    intended to "better protect" Google Search against malicious
    activity, such as bots and spam, and to improve the overall
    Google Search experience for users.

    [...]

    How is this going to '"better protect" Google Search against
    malicious activity, such as bots and spam'?

    I believe the idea is that if the robot doesn't speak Javascript,
    it's an easy denial by the web server. And making bots speak
    Javascript is one step up. And with Javascript they can likely
    monitor things like mouse movement to detect whether the user is a
    human or a robot.

    I'm not approving the idea; just sharing what I think they might
    have in mind when they say Javascript will help them fend off
    robots.

    It would also make it harder to scrape, since I /think/ web scrapers
    don't run JS by default.

    Which just means this will push web scrapers to start running JS.

    They don't run JS (yet) because they have not needed to run JS to do
    their scraping. But if JS is required, and they want to scrape bad
    enough, they will put in support for running JS.

    Why can't web scrapers just pretend to be Lynx browsers?

  • From Rich@21:1/5 to candycanearter07@candycanearter07.n on Wed Jan 29 19:33:06 2025
    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:
    Salvador Mirzo <smirzo@example.com> wrote at 13:33 this Tuesday (GMT):
    Sylvia Else <sylvia@email.invalid> writes:

    On 21-Jan-25 5:10 am, Computer Nerd Kev wrote:
    Google begins requiring JavaScript for Google Search
    by Thom Holwerda 2025-01-18 -
    https://www.osnews.com/story/141570/google-begins-requiring-javascript-for-google-search/
    " Google says it has begun requiring users to turn on
    JavaScript, the widely used programming language to make web
    pages interactive, in order to use Google Search. In an email
    to TechCrunch, a company spokesperson claimed that the change is
    intended to "better protect" Google Search against malicious
    activity, such as bots and spam, and to improve the overall
    Google Search experience for users.

    [...]

    How is this going to '"better protect" Google Search against
    malicious activity, such as bots and spam'?

    I believe the idea is that if the robot doesn't speak Javascript,
    it's an easy denial by the web server. And making bots speak
    Javascript is one step up. And with Javascript they can likely
    monitor things like mouse movement to detect whether the user is a
    human or a robot.

    I'm not approving the idea; just sharing what I think they might
    have in mind when they say Javascript will help them fend off
    robots.

    It would also make it harder to scrape, since I /think/ web scrapers
    don't run JS by default.

    Which just means this will push web scrapers to start running JS.

    They don't run JS (yet) because they have not needed to run JS to do
    their scraping. But if JS is required, and they want to scrape bad
    enough, they will put in support for running JS.

  • From Scott Dorsey@21:1/5 to dnomhcir@gmx.com on Wed Jan 29 22:15:35 2025
    In article <861pwl5qu8.fsf@example.com>, Richmond <dnomhcir@gmx.com> wrote:

    Why can't web scrapers just pretend to be Lynx browsers?

    Some do. That's why so many web servers refuse connections from Lynx.
    --scott
    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."

  • From yeti@21:1/5 to Rich on Thu Jan 30 01:53:15 2025
    Rich <rich@example.invalid> wrote:

    They could, that is until google simply starts expecting JS to be
    executed regardless of the value of the user agent header.

    Then scrapers will add JS to their agents and the users of older
    browsers are the only ones reliably locked out.

    --
    "The government you elect is the government you deserve"
    - Thomas Jefferson

  • From Rich@21:1/5 to Richmond on Thu Jan 30 00:53:22 2025
    Richmond <dnomhcir@gmx.com> wrote:
    Rich <rich@example.invalid> writes:

    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:
    Salvador Mirzo <smirzo@example.com> wrote at 13:33 this Tuesday (GMT):
    Sylvia Else <sylvia@email.invalid> writes:

    On 21-Jan-25 5:10 am, Computer Nerd Kev wrote:
    Google begins requiring JavaScript for Google Search by Thom
    Holwerda 2025-01-18 -
    https://www.osnews.com/story/141570/google-begins-requiring-javascript-for-google-search/
    " Google says it has begun requiring users to turn on
    JavaScript, the widely used programming language to make web
    pages interactive, in order to use Google Search. In an email
    to TechCrunch, a company spokesperson claimed that the change is
    intended to "better protect" Google Search against malicious
    activity, such as bots and spam, and to improve the overall
    Google Search experience for users.

    [...]

    How is this going to '"better protect" Google Search against
    malicious activity, such as bots and spam'?

    I believe the idea is that if the robot doesn't speak Javascript,
    it's an easy denial by the web server. And making bots speak
    Javascript is one step up. And with Javascript they can likely
    monitor things like mouse movement to detect whether the user is a
    human or a robot.

    I'm not approving the idea; just sharing what I think they might
    have in mind when they say Javascript will help them fend off
    robots.

    It would also make it harder to scrape, since I /think/ web scrapers
    don't run JS by default.

    Which just means this will push web scrapers to start running JS.

    They don't run JS (yet) because they have not needed to run JS to do
    their scraping. But if JS is required, and they want to scrape bad
    enough, they will put in support for running JS.

    Why can't web scrapers just pretend to be Lynx browsers?

    They could, that is until google simply starts expecting JS to be
    executed regardless of the value of the user agent header.

  • From Rich@21:1/5 to yeti on Thu Jan 30 03:38:12 2025
    yeti <yeti@tilde.institute> wrote:
    Rich <rich@example.invalid> wrote:

    They could, that is until google simply starts expecting JS to be
    executed regardless of the value of the user agent header.

    Then scrapers will add JS to their agents and the users of older
    browsers are the only ones reliably locked out.

    Which is exactly what I posted four posts back:

    Message-ID: <vndvpi$2h3ut$1@dont-email.me>

    candycanearter07 <candycanearter07@candycanearter07.nomail.afraid> wrote:
    Sylvia Else <sylvia@email.invalid> writes:
    I'm not approving the idea; just sharing what I think they might
    have in mind when they say Javascript will help them fend off
    robots.

    It would also make it harder to scrape, since I think web scrapers
    don't run JS by default.

    Which just means this will push web scrapers to start running JS.

    They don't run JS (yet) because they have not needed to run JS to do
    their scraping. But if JS is required, and they want to scrape bad
    enough, they will put in support for running JS.

  • From yeti@21:1/5 to Rich on Thu Jan 30 04:43:38 2025
    Rich <rich@example.invalid> wrote:

    yeti <yeti@tilde.institute> wrote:
    Rich <rich@example.invalid> wrote:

    They could, that is until google simply starts expecting JS to be
    executed regardless of the value of the user agent header.

    Then scrapers will add JS to their agents and the users of older
    browsers are the only ones reliably locked out.

    Which is exactly what I posted four posts back:

    Hit shappens!

    And IMO this is no reason to repeat yourself.

    --
    Trump-Fatigue?
    Try ... <https://bobskaradio.com/> ... now!

    \\o o// \o/

  • From D@21:1/5 to Scott Dorsey on Thu Jan 30 10:50:30 2025
    On Wed, 29 Jan 2025, Scott Dorsey wrote:

    In article <861pwl5qu8.fsf@example.com>, Richmond <dnomhcir@gmx.com> wrote:

    Why can't web scrapers just pretend to be Lynx browsers?

    Some do. That's why so many web servers refuse connections from Lynx.
    --scott

    That's racism and illegal! I use elinks and have not had any problems. It
    must be the Trump of text-based browsers!

  • From Ivan Shmakov@21:1/5 to All on Thu Jan 30 18:47:28 2025
    On 2025-01-29, Scott Dorsey wrote:
    Richmond <dnomhcir@gmx.com> wrote:

    Why can't web scrapers just pretend to be Lynx browsers?

    Some do. That's why so many web servers refuse connections from Lynx.

    IME it's more common for HTTP servers to react to "libwww" in
    Lynx' User-Agent: rather than "Lynx": removing the former (while
    keeping "Lynx") has often enough resolved the issue for me.

    (These days, I mostly just switch to reading the site via
    http://web.archive.org/ right away, though.)

    Might be because "libwww" is both the name of the library Lynx
    is based on, /and/ the name of an unrelated (AIUI) Perl library
    that, I gather, used to be popular among web robot writers.
    (See, e. g., http://packages.debian.org/sid/libwww-perl .)
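The filtering behaviour described above can be sketched as a tiny server-side rule keyed on the "libwww" substring rather than on "Lynx" itself. The rule is hypothetical; real server configurations vary.

```python
# Sketch of a hypothetical server-side UA filter keyed on "libwww",
# which blocks stock Lynx but passes a trimmed User-Agent string.
def blocked_by_ua_filter(user_agent: str) -> bool:
    """Return True if a 'libwww'-keyed filter would reject this agent."""
    return "libwww" in user_agent.lower()

stock_lynx   = "Lynx/2.9.0 libwww-FM/2.14 SSL-MM/1.4.1"
trimmed_lynx = "Lynx/2.9.0 SSL-MM/1.4.1"   # "libwww-FM" token removed

print(blocked_by_ua_filter(stock_lynx), blocked_by_ua_filter(trimmed_lynx))
```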

    A cursory look over my access.log files seems to hint that Go
    is way more popular a choice for the task these days, though
    my overall impression is that robot authors just use any of
    the popular user agent strings for their software instead of
    anything that might identify their actual codebase.

    Which means that making *any* big decisions based on User-Agent:
    statistics (like, "Look, we're getting lots of hits from
    Arachne users recently; let's optimize our site for their
    best experience at once!") is ill-advised at best: you might
    end up being trolled by a particularly creative botnet operator.

    Personally, as a web author, I try to a. stick to the standards;
    b. have an actual reason for using one feature or another
    (rather than going for "for consistency" or "just because" or
    "this new shiny framework needs it") [*]; and c. mind my audience.

    Sure, I use Lynx a lot for testing, so the webpages I author
    tend to end up being compatible with Lynx, and might be less
    compatible with other UAs. However, the idea that I should
    adapt my practices to the idiosyncrasies of any particular
    UA, regardless of its market share, rubs me the wrong way.
    The "making sure the site works with IE" sort of wrong.

    Conversely, as a reader of that same web, I expect to get a
    standards-compliant document from the site. I deem it my own
    responsibility to make use of it. For instance, I certainly
    won't hold it against the site operator if /my/ software chokes
    on something that /is/ standard.

    What really irks me, though, is when in place of a document,
    I get an application. (Doesn't even matter if it's .js, .exe,
    or .tex.)

    Not that I don't get disappointed on occasion when a website
    "improves" its typography, or switches to a more "mobile-friendly"
    look and feel. But that's one of the major reasons for me to
    stick with Lynx in the first place: go and try to tweak the CSS
    to make your website look more "modern" when viewed with Lynx!

    [*] As a rule, my HTML is expected to comply with the requirements
    of the Live Standard, for both text/html and application/xhtml+xml
    Content-Type:s at the same time (the idea is that if .xhtml does
    not work for someone, the file can be downloaded, renamed to
    .html, and viewed that way.) My CSS should be /mostly/ 2.1
    with some CSS3 Selectors (though I haven't quite checked it.)
    When JavaScript is used (i. e., when I publish an application,
    not just a document), it ought to conform to ECMA-262 6 (2015),
    though the set of browser APIs used might vary depending on what
    the application aims to do.

  • From Lawrence D'Oliveiro@21:1/5 to All on Mon Feb 24 05:38:09 2025
    On Wed, 29 Jan 2025 19:00:04 -0000 (UTC), candycanearter07 wrote:

    It would also make it harder to scrape, since I /think/ web scrapers
    don't run JS by default.

    I don’t see why it’s so hard to do. Toolkits like PhantomJS and Selenium have been commonplace for years, for precisely this sort of use. They’re
    in the standard Debian repos, so should be available in derivatives
    thereof -- check your distro with an apt-cache search.
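The Selenium route can be sketched as below: drive a headless browser so the scraper executes JavaScript like any ordinary client. This assumes the `selenium` package and a Chrome/Chromium install; the import is guarded so the sketch degrades gracefully when they are absent.

```python
# Sketch: a scraper that runs JavaScript by driving a headless browser.
# Requires the selenium package plus a Chrome/Chromium install to run
# for real; the guarded import lets the file load without them.
try:
    from selenium import webdriver
except ImportError:
    webdriver = None

def fetch_rendered(url: str) -> str:
    """Return the page HTML *after* JavaScript has executed."""
    if webdriver is None:
        raise RuntimeError("selenium is not installed")
    opts = webdriver.ChromeOptions()
    opts.add_argument("--headless=new")     # no visible window
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)
        return driver.page_source           # DOM after JS has run
    finally:
        driver.quit()
```

PhantomJS itself is no longer maintained, so a headless Chrome or Firefox via Selenium is the more common choice today.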
