
How do I log in to a website using the requests library? I watched a lot of tutorials, but they all seem to have a 302 POST request in the Network tab of the inspector, while I only see a lot of GET requests in my tab when I log in. A friend of mine said cookies, but I am really a beginner and I don't know how to log in.

Also, I would like to know the range of things I can use this library for, and any helpful source of information from which I can begin learning it.


import requests

r = requests.get("https://example.com")

I want to send a POST request. The same friend told me that I would need API access to that website to proceed further. Is that true?

newbietoprog

3 Answers


Depending on the site you are trying to log in to, it may be necessary to log in via a Chrome browser (Selenium) and from there extract and save the cookies for further injection and use within the requests module.

To extract cookies from Selenium and save them to a JSON file:

import json

# "driver" is a Selenium webdriver that is already logged in to the site
cookies = driver.get_cookies()
with open('file_you_want_to_save_the_cookies_to.json', 'w') as f:
    json.dump(cookies, f)

To then use these cookies in the requests module:

import requests

url = 'https://example.com/page'            # placeholder: the page you want while logged in
headers = {'User-Agent': 'Mozilla/5.0'}     # any extra headers you want to send

cookies = {
    'cookie_name': 'cookie_value'           # name/value pairs taken from the saved cookies
}
with requests.Session() as s:
    r = s.get(url, headers=headers, cookies=cookies)
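
Alternatively, instead of copying values by hand, you can load the JSON file saved above straight into the session's cookie jar. A minimal sketch, assuming the file name from above and a placeholder URL:

import json
import requests

# Load the cookies that Selenium saved earlier
with open('file_you_want_to_save_the_cookies_to.json') as f:
    saved_cookies = json.load(f)

with requests.Session() as s:
    # Selenium stores each cookie as a dict with 'name' and 'value' keys
    for cookie in saved_cookies:
        s.cookies.set(cookie['name'], cookie['value'])
    r = s.get('https://example.com/page')   # placeholder URL
    print(r.status_code)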

I could help further if you mention what site you are trying to do this on

Rxyces
  • I used Selenium before I learned about requests, but the automation becomes a bit more difficult for me to handle, so I am trying to use requests. The website belongs to my company; I don't own it – newbietoprog Jan 04 '21 at 15:00
  • I am only suggesting using Selenium for logging in and generating a valid session, which can then be extracted and used with requests. Some sites, for example Instagram, do not have any POST requests for login that can be viewed from the browser, so in that case you would use Selenium and then use the session from Selenium with requests to further interact with the web page – Rxyces Jan 04 '21 at 15:35
  • I have the cookies, would you want me to paste them here? – newbietoprog Jan 04 '21 at 15:36
  • After you have the cookies, use what I sent above. The session generated by Selenium will be valid, so any request sent using those cookies will be treated as if a user had logged in. This is useful for *after* logging in, if you want to communicate with the site further – Rxyces Jan 04 '21 at 16:52
  • I want to ask: do I need to send every request that my browser sends when done manually, or do I need to send some specific requests? – newbietoprog Jan 04 '21 at 18:26
  • It depends on what you're trying to do, but it's mostly just the crucial ones, so ignore all the Facebook requests and all that. For example, if you wanted to check out a product on a site, you would look for the add-to-cart request, then the shipping submission, and so on. – Rxyces Jan 04 '21 at 21:23
  • I see, do you have Discord or any other social media? I want to ask questions; SO is not really suited for chatting, I guess – newbietoprog Jan 05 '21 at 08:40
  • Discord is RC#4958 – Rxyces Jan 05 '21 at 12:13

Well, a good practice is to create a little project that uses the library. For the requests lib, that can be accessing some of the free APIs on the internet. For example, I made this one a while ago, which uses a free API:

import requests

##########################################
#                                        #
#              INSULT API                #
#                                        #
##########################################

def insult_api(name):
    # Call the free insult API and return the parsed JSON response
    params = {"who": name}
    response = requests.get(
        "https://insult.mattbas.org/api/en/insult.json",
        params=params)
    return response.json()
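
A quick way to try it, assuming the function above (the name passed in is just an example):

if __name__ == "__main__":
    # Prints whatever JSON the API returns for the given name
    print(insult_api("newbietoprog"))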

A helpful source is (obviously) the official documentation, plus basically any YouTube video or SO post here. Just look for the thing you want to do. Anyway, if you are looking to log into a website without API access, you can use the Selenium library.

Leemosh

The requests module is extremely powerful if used properly and with the necessary information sent in the requests. First, analyse the network traffic via tools like the Network tab in Chrome DevTools. Then try to replicate the request via requests in Python.

Usually, you will need to send headers and data.

headers = {
    <your headers here>
}

data = <data here> 

req = requests.post("https://www.examplesite.com/api/login", data=data, headers=headers)

Everything should be easily found in the network requests, unless the site has some sort of security like CSRF tokens, etc., which need to be sent along with the login request. In order to do this, you need to send a GET request to get the info, then send a POST request with the info.
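
As a rough sketch of that GET-then-POST flow: the URL, the form field names, and the way the CSRF token is embedded in the page are all assumptions here and will differ per site:

import re
import requests

with requests.Session() as s:
    # GET the login page first so the session picks up any cookies it sets
    login_page = s.get("https://www.examplesite.com/login")

    # Hypothetical: pull a CSRF token out of a hidden form field;
    # the field name "csrf_token" is an assumption
    match = re.search(r'name="csrf_token" value="([^"]+)"', login_page.text)
    csrf_token = match.group(1) if match else ""

    data = {
        "username": "your_username",   # assumed field names
        "password": "your_password",
        "csrf_token": csrf_token,
    }
    r = s.post("https://www.examplesite.com/api/login", data=data)
    print(r.status_code)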

If you could provide the site you're trying to use it would be pretty helpful too. Best of luck!

NewCoder
  • I want to ask: do I need to send every request that my browser sends when done manually, or do I need to send some specific requests? – newbietoprog Jan 04 '21 at 15:09