Preventing bot form submission




I'm trying to figure out a good way to prevent bots from submitting my form, while keeping the process simple. I've read several great ideas, but I thought about adding a confirm option when the form is submitted. The user clicks submit and a JavaScript confirm prompt pops up which requires user interaction.



Would this prevent bots or could a bot figure this out too easily? Below is the code and a JSFiddle to demonstrate my idea:



JSFIDDLE


$('button').click(function () {
    if (Confirm()) {
        alert('Form submitted');
        /* perform a $.post() to php */
    } else {
        alert('Form not submitted');
    }
});

function Confirm() {
    var _question = confirm('Are you sure about this?');
    var _response = (_question) ? true : false;
    return _response;
}




If a bot can talk to the server directly, the JavaScript is irrelevant - who says it behaves like a human? There are hidden fields, honey-pot hidden fields, captchas, etc. But if someone really wants to spam your site, they'll just tailor the bot (and I'm sure there is no shortage of sophisticated bot-spam tools or low-wage differentials to exploit). The only way to truly prevent spam is to require authentication - and a way to deal with the spammer, such as blocking or limiting the account.
– user166390
Mar 10 '13 at 7:31






Use a CAPTCHA in your form.
– Alireza41
Mar 10 '13 at 7:33






On hover, enable the button.
– ucefkh
Apr 6 '14 at 15:14




9 Answers



This is one problem that a lot of people have encountered. As pst points out, the bot can just submit information directly to the server, bypassing the JavaScript (see simple utilities like cURL and Postman). Many bots are capable of consuming and interacting with the JavaScript now. Hari krishnan points out the use of captcha, the most prevalent and successful of which (to my knowledge) is reCaptcha. But captchas have their problems and are discouraged by the World Wide Web Consortium (W3C), mostly for reasons of ineffectiveness and inaccessibility.



And lest we forget, an attacker can always deploy human intelligence to defeat a captcha. There are stories of attackers paying people to crack captchas for spamming purposes without the workers realizing they're participating in illegal activities. Amazon offers a service called Mechanical Turk that tackles things like this. Amazon would strenuously object if you were to use their service for malicious purposes, and it has the downside of costing money and creating a paper trail. However, there are other, less scrupulous providers out there who would harbor no such objections.



My favorite mechanism is a hidden checkbox. Make it have a label like 'Do you agree to the terms and conditions of using our services?' perhaps even with a link to some serious looking terms. But you default it to unchecked and hide it through css: position it off page, put it in a container with a zero height or zero width, position a div over top of it with a higher z-index. Roll your own mechanism here and be creative.



The secret is that no human will see the checkbox, but most bots fill forms by inspecting the page and manipulating it directly, not through actual vision. Therefore, any form that comes in with that checkbox value set allows you to know it wasn't filled by a human. This technique is called a bot trap. The rule of thumb for the type of auto-form filling bots is that if a human has to intercede to overcome an individual site, then they've lost all the money (in the form of their time) they would have made by spreading their spam advertisements.



(The previous rule of thumb assumes you're protecting a forum or comment form. If actual money or personal information is on the line, then you need more security than just one heuristic. This is still security through obscurity, it just turns out that obscurity is enough to protect you from casual, scripted attacks. Don't deceive yourself into thinking this secures your website against all attacks.)



The other half of the secret is keeping it. Do not alter the response in any way if the box is checked. Show the same confirmation, thank you, or whatever message or page afterwards. That will prevent the bot from knowing it has been rejected.
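As a rough sketch of this honeypot, assuming a PHP backend plus plain HTML/CSS (the field name, CSS class, and thankyou.php below are placeholders of my own choosing, not part of the question):

<style>
    /* Pushed off-screen so no human ever sees or ticks it. */
    .tos-trap { position: absolute; left: -9999px; }
</style>
<form action="submit.php" method="post">
    <textarea name="comment"></textarea>
    <label class="tos-trap">
        Do you agree to the terms and conditions of using our services?
        <input type="checkbox" name="agree_tos" value="1">
    </label>
    <input type="submit" value="Send">
</form>

<?php
// submit.php - a checked trap box means "not a human".
if (!empty($_POST['agree_tos'])) {
    include 'thankyou.php'; // same response as a real submission; reveal nothing
    exit;
}
// ...process $_POST['comment'] normally...
include 'thankyou.php';

Note the trap uses CSS to hide a normal checkbox rather than type="hidden"; naive bots often skip fields marked hidden but happily tick anything that looks like a checkbox.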



I am also a fan of the timing method. You have to implement it entirely on the server side. Track the time the page was served in a persistent way (essentially the session) and compare it against the time the form submission comes in. This prevents forgery or even letting the bot know it's being timed - if you make the served time a part of the form or javascript, then you've let them know you're on to them, inviting a more sophisticated approach.
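A minimal server-side sketch of the timing check using PHP sessions (the five-second threshold and file names are assumptions for illustration only):

<?php
// form.php - record when the form was served; the timestamp never leaves the server.
session_start();
$_SESSION['form_served_at'] = time();
// ...render the form as usual...

<?php
// submit.php - compare the serve time against the time the submission arrives.
session_start();
$servedAt = isset($_SESSION['form_served_at']) ? $_SESSION['form_served_at'] : 0;
if (time() - $servedAt < 5) {   // hypothetical threshold: faster than any human types
    include 'thankyou.php';     // same response as a real submission
    exit;
}
// ...normal processing...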



Again though, just silently discard the request while serving the same thank you page (or introduce a delay in responding to the spam form, if you want to be vindictive - this may not keep them from overwhelming your server and it may even let them overwhelm you faster, by keeping more connections open longer. At that point, you need a hardware solution, a firewall on a load balancer setup).



There are a lot of resources out there about delaying server responses to slow down attackers, frequently in the form of brute-force password attempts. This IT Security question looks like a good starting point.



I had been thinking about updating this question for a while regarding the topic of computer vision and form submission. An article surfaced recently that pointed me to this blog post by Steve Hickson, a computer vision enthusiast. Snapchat (apparently some social media platform? I've never used it, feeling older every day...) launched a new captcha-like system where you have to identify pictures (cartoons, really) which contain a ghost. Steve proved that this doesn't verify squat about the submitter, because in typical fashion, computers are better and faster at identifying this simple type of image.



It's not hard to imagine extending a similar approach to other Captcha types. I did a search and found these links interesting as well:



Is reCaptcha broken?
Practical, non-image based Captchas
If we know CAPTCHA can be beat, why are we still using them?
Is there a true alternative to using CAPTCHA images?
How a trio of Hackers brought Google's reCaptcha to its knees - extra interesting because it is about the audio Captchas.



Oh, and we'd hardly be complete without an obligatory XKCD comic.





Wow, thank you for the information. I have read up on ways to prevent bots and most suggest CAPTCHAs, but lately I've been reading people say CAPTCHAs are not going to be around in the near future. This gives me information that I can research, thank you.
– Mike
Mar 10 '13 at 7:56





I wouldn't say they won't be around in the near future. In my opinion, they have enough of a downside that they're falling out of favor for widespread use. There are plenty of stories (or rants) of Captchas making it way harder for legitimate users and not even stopping 100% of bot traffic. For sensitive applications, a level of hardness is acceptable, but if it's a small application or especially one where you benefit more than the user from them completing your form (e.g. feedback, or a business model with heavy competition), Captchas can cause you more problems than they solve.
– Patrick M
Mar 10 '13 at 19:32





In case of the register form, should I apply this measure too? Patrick M, check my profile, please.
– ngungo
Jan 31 '17 at 17:51





In general, yes, you want to protect every input form with some kind of human detection. Registration forms typically require email verification; not for bot detection but to verify that you have some way to contact the user. If it's registration for an email service, well, check out what gmail does when you create a new account (and they have spam detection built into the sending protocol). If it's registration for a public forum, then absolutely use as much bot detection as you can, because (in my experience) that attracts the most bots looking for easy ways to spam.
– Patrick M
Jan 31 '17 at 20:04





I am not a lawyer and this is not legal advice. You may still violate any number of laws for user protections and privacy even if you provide the utmost bot detection.
– Patrick M
Jan 31 '17 at 20:08



Today I successfully stopped a continuous spamming of my form. This method might not always work of course, but it was simple and worked well for this particular case.



I did the following:



I set the action property of the form to mustusejavascript.asp which just shows a message that the submission did not work and that the visitor must have javascript enabled.



I set the form's onsubmit property to a javascript function that sets the action property of the form to the real receiving page, like receivemessage.asp



The bot in question apparently does not handle javascript so I no longer see any spam from it. And for a human (who has javascript turned on) it works without any inconvenience or extra interaction at all. If the visitor has javascript turned off, he will get a clear message about that if he makes a submission.
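A stripped-down illustration of that setup (the page names come from the answer itself; the function and field names are mine):

<form name="contact" action="mustusejavascript.asp" method="post"
      onsubmit="return useRealAction(this);">
    <input type="text" name="message">
    <input type="submit" value="Send">
</form>

<script>
// Swap in the real receiving page just before submission.
// A bot (or browser) that never runs JavaScript posts to mustusejavascript.asp instead.
function useRealAction(form) {
    form.action = 'receivemessage.asp';
    return true; // allow the submission to continue
}
</script>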



Your code would not prevent bot submission, but that's not because of anything in your code specifically. The typical bot out there will more likely do an external/automated POST request to the URL (the form's action attribute). Typical bots aren't rendering HTML, CSS, or JavaScript; they are reading the HTML and acting upon it, so any client-side logic will not be executed. For example, cURLing a URL will get the markup without loading or evaluating any JavaScript. One could create a simple script that looks for <form> and then does a cURL POST to that URL with the matching keys.
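To make that concrete, here is roughly all a naive bot needs, sketched in PHP with cURL (the URL and field names are invented for illustration):

<?php
// What a simple spam bot does: skip the page entirely and POST straight
// to the form's action URL. No HTML rendering, no JavaScript executed.
$ch = curl_init('https://example.com/receivemessage.php'); // hypothetical target
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'name'    => 'Totally A Human',
    'comment' => 'spam goes here',
]));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);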





With that in mind, a server-side solution to prevent bot submission is necessary. Captcha + CSRF should suffice. (http://en.wikipedia.org/wiki/Cross-site_request_forgery)





Thank you for the information. I never realized how sophisticated bots can actually be. My thought was, if the user has to interact, the bot will not be able to perform its job. I didn't realize bots can read JavaScript and determine the PHP page. Would something like a token work to mitigate false posts?
– Mike
Mar 10 '13 at 8:02



You could simply add a captcha to your form. Since captchas are different each time and rendered as images, bots cannot decode them. This is one of the most widely used security measures for websites...
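For reference, this is roughly what wiring up one common option, Google's reCAPTCHA v2 checkbox, looks like; the keys below are placeholders and the PHP verification is only a sketch:

<!-- form page -->
<script src="https://www.google.com/recaptcha/api.js" async defer></script>
<form action="submit.php" method="post">
    <textarea name="comment"></textarea>
    <div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div>
    <input type="submit" value="Send">
</form>

<?php
// submit.php - ask Google to verify the token posted as g-recaptcha-response.
$context = stream_context_create(['http' => [
    'method'  => 'POST',
    'header'  => 'Content-Type: application/x-www-form-urlencoded',
    'content' => http_build_query([
        'secret'   => 'YOUR_SECRET_KEY',
        'response' => isset($_POST['g-recaptcha-response']) ? $_POST['g-recaptcha-response'] : '',
        'remoteip' => $_SERVER['REMOTE_ADDR'],
    ]),
]]);
$raw = file_get_contents('https://www.google.com/recaptcha/api/siteverify', false, $context);
$result = json_decode($raw, true);
if (empty($result['success'])) {
    exit('CAPTCHA check failed'); // reject the submission
}
// ...process the form as usual...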







I've used them in the past and people complain about the readability. Plus I've read those are going by the wayside.
– Mike
Mar 10 '13 at 7:33





@Mike look into reCaptcha. It's quite popular.
– John Dvorak
Mar 10 '13 at 7:35





@Mike you can make your own images as a captcha. Make simple images; since bots cannot identify them, that will not be a problem.
– Hari krishnan
Mar 10 '13 at 7:40



You could measure the registration time: no bot needs an eternity to fill in the text boxes!





A bot could easily forge the time needed to fill in a form. Bots excel at waiting.
– John Dvorak
Mar 10 '13 at 7:36





You're right, but it still minimizes the unwanted accesses.
– Mikatsu
Mar 10 '13 at 7:40



You cannot achieve your goal with JavaScript, because a client can parse your JavaScript and bypass your methods. You have to do validation on the server side, e.g. via captchas. The main idea is that you store a secret on the server side and validate the form submitted from the client against that secret on the server side.





Just passing a secret won't do; it needs to be encoded in such a way that a human can decode it easily, but an automated script can't.
– John Dvorak
Mar 10 '13 at 7:38





and this is what a CAPTCHA is... :)
– oguzhanyalcin
Mar 11 '13 at 0:17



No, really, are you still thinking that CAPTCHA or reCAPTCHA are safe?



Bots nowadays are smart and can easily recognise letters in images using OCR tools (search for it to understand).



I say the best way to protect yourself from automated form submission is to add a hidden hash, generated (and stored in the session on your server for the current client) every time you display the form.



That's all: when the bot or any zombie submits the form, you check whether the given hash equals the hash stored in the session ;)



For more info, read about CSRF!
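A bare-bones sketch of that idea with PHP sessions (the field and session key names are mine, and the token is single-use here, as a commenter below suggests):

<?php
// form.php - generate a fresh hash and remember it in the session.
session_start();
$_SESSION['form_token'] = bin2hex(random_bytes(32)); // PHP 7+
?>
<form action="submit.php" method="post">
    <input type="hidden" name="token" value="<?php echo $_SESSION['form_token']; ?>">
    <textarea name="message"></textarea>
    <input type="submit" value="Send">
</form>

<?php
// submit.php - reject anything whose hash doesn't match, then throw the token away.
session_start();
$sent = isset($_POST['token']) ? $_POST['token'] : '';
$kept = isset($_SESSION['form_token']) ? $_SESSION['form_token'] : '';
if ($sent === '' || !hash_equals($kept, $sent)) {
    exit('Invalid submission');
}
unset($_SESSION['form_token']); // single use: a new token is issued with the next form
// ...process the form...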





CSRF protection does not prevent bots. It's for something else, as the abbreviation hints.
– code ninja
Jan 24 '14 at 17:49





Even if you wanna make it hard for a bot, add some JavaScript and load the form with AJAX ;)
– ucefkh
Jan 24 '14 at 22:20





The bot could perform GET to get your CSRF token, then perform many POSTS, as a single token is valid for more than one request by specification. I mean, look at DRM protection and all, the difficulty is proportionate to the time you spend on it (complexity).. DRM is still circumvented no matter what secret sauces you pour into the recipe.
– code ninja
Jan 24 '14 at 23:35





@matejkramny yeah, that's good, but I don't follow the specification :) I change the token on every request :D, and that's what's done by the majority of other web apps ;)
– ucefkh
Apr 6 '14 at 15:08




I ran across a form input validation that prevented programmatic input from registering.



My initial tactic was to grab the element and set it to the option I wanted. I triggered focus on the input fields and simulated clicks on each element to get the drop-downs to show up, then set the value, firing the events for changing values. But when I tried to click save, the inputs were not registered as having changed.


;failed automation attempt because window doesnt register changes.
;$iUse = _IEGetObjById($nIE,"InternalUseOnly_id")
;_IEAction($iUse,"focus")
;_IEAction($iUse,"click")
;_IEFormElementOptionSelect($iUse,1,1,"byIndex")
;$iEdit = _IEGetObjById($nIE,"canEdit_id")
;_IEAction($iEdit,"focus")
;_IEAction($iEdit,"click")
;_IEFormElementOptionSelect($iEdit,1,1,"byIndex")
;$iTalent = _IEGetObjById($nIE,"TalentReleaseFile_id")
;_IEAction($iTalent,"focus")
;_IEAction($iTalent,"click")
;_IEFormElementOptionSelect($iTalent,2,1,"byIndex")
;Sleep(1000)
;_IEAction(_IETagNameGetCollection($nIE,"button",1),"click")



This caused me to rethink how input could be entered, by directly manipulating the mouse's actions to simulate selection with mouse-type behavior. Needless to say, I won't have to manually upload images one by one to update product images for companies. I used Windows' numbers-before-letters sorting to keep my script at the end of the directory, and when the image upload window pops up I use Active Accessibility to get the SysListView from the window and select the second element, which is a picture (the first element is a folder), or the first element in a FindFirstFile call that returns only files. I use the name to search for the item in a database of items, then access those items and update a few attributes after the images are uploaded. Then I move the file from that folder to another folder so it doesn't get processed again, move on to the next first file in the list, and loop until the script's name is found at the end of the update.



Just sharing how a lowly data entry person saves time, and fights all these evil form validation checks.



Regards.



This is a very short version that hasn't failed since it was implemented on my sites 4 years ago, with added variances as needed over time. It can be built up with all the variables and if/else statements that you require:


function spamChk() {
    var ent1 = document.MyForm.Email.value;
    var str1 = ent1.toLowerCase();
    if (str1.includes("noreply")) {
        document.MyForm.reset();
    }
}

<input type="text" name="Email" oninput="spamChk()">



I had actually come here today to find out how to redirect particular spam bot IP addresses to H E L L .. just for fun






By clicking "Post Your Answer", you acknowledge that you have read our updated terms of service, privacy policy and cookie policy, and that your continued use of the website is subject to these policies.
