How to make an HTTP request using Ruby on Rails?
I would like to fetch information from another website. To do that, I should presumably make a request to that site (in my case an HTTP GET request) and read the response.

How can I do this in Ruby on Rails?

If it is possible, is it correct to do this from my controllers?

Budbudapest asked 2/1, 2011 at 23:19 Comment(0)
You can use Ruby's built-in Net::HTTP class:

require 'net/http'

url = URI.parse('http://www.example.com/index.html')
req = Net::HTTP::Get.new(url.request_uri)
res = Net::HTTP.start(url.host, url.port) do |http|
  http.request(req)
end
puts res.body
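A question that comes up with this approach is how to bound slow requests. Net::HTTP exposes open_timeout and read_timeout for that; here is a minimal sketch (the URL is the same example.com placeholder, and the actual request is left commented out):

```ruby
require 'net/http'
require 'uri'

url = URI.parse('http://www.example.com/index.html')

http = Net::HTTP.new(url.host, url.port)
http.open_timeout = 5   # seconds allowed for the TCP connection to open
http.read_timeout = 10  # seconds allowed between reads of the response

# res = http.request(Net::HTTP::Get.new(url.request_uri))
# puts res.body
```

If either limit is exceeded, the request raises Net::OpenTimeout or Net::ReadTimeout respectively.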
Puddling answered 2/1, 2011 at 23:29 Comment(10)
What does the 'req' mean here? (Anywise)
Looks like this might be a blocking request, would it not? (Skees)
Where do I put the API key? (Supersensitive)
@JoãoSilva How can I set a timeout for my request? (Hacksaw)
Just adding that the www. shouldn't be necessary; it typically isn't. (Cataclinal)
What if you get an Errno::ECONNREFUSED: Failed to open TCP connection to :80 error? (Trivandrum)
@ScottEisenberg, it is a blocking request, you are right. There are a variety of ways to build an async request. (Tectonic)
This link provides several code snippets for use cases of the Net::HTTP module. Very helpful. (Cheatham)
I always wondered if www.example.com had anything. Turns out it does! (Follow)
This doesn't work in all cases; you really should look at net/http's manual for how to handle your particular case. (Christiansen)
Net::HTTP is built into Ruby, but let's face it: its style is cumbersome and dated, and it's often easier to reach for a higher-level alternative such as HTTParty or httpclient, both of which are covered in other answers here.

Disperse answered 2/1, 2011 at 23:39 Comment(7)
Or ActiveResource, which comes with Rails! (Debauchee)
I would like to caution against doing so, as you will add more dependencies to your Rails app. More dependencies mean more memory consumption and potentially a larger attack surface. Using Net::HTTP is cumbersome, but the trade-off isn't worth it. (Coreen)
This should be the accepted answer. Why program when you can just install lots of gems? (Jacobo)
Read Jason Yeo's comment; it's better to avoid a lot of dependencies. (Reparation)
@JasonYeo Strongly disagree. Introducing dependencies means you don't reinvent the wheel, and you benefit from the hard work others have already done. If a gem exists that makes your life easier, there's generally no good reason not to use it. (Debauchee)
@MarnenLaibow-Koser Uh-huh. Something something left-pad saga. But anyway, I use HTTParty. I love it. It's super easy to use if I'm making a request with a form body. But if I'm simply making a GET request, I'd rather stick to Net::HTTP; in that case it's not worth including 10MB of dependencies just to make a small GET request. (Coreen)
@JasonYeo The left-pad saga only happened because NPM ran its repository poorly and let the author delete all his packages. Properly managed package repos don't do that (and anyway, it's OSS, so you can easily mirror it if you want). That is, the left-pad saga is not an argument against introducing dependencies in general, but rather against managing a repo poorly. I do agree with your other point, that a big dependency doing way more than you need can be overkill for the value it provides. (Debauchee)
OpenURI is the best; it's as simple as

require 'open-uri'
response = URI.open('http://example.com').read

(Use URI.open here: calling a bare open on a URL was deprecated in Ruby 2.7 and no longer works in Ruby 3.)
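One caveat raised in the comments is that the handle returned this way is not closed for you. Passing a block to URI.open closes the stream automatically when the block returns; a small sketch (the helper name is just an illustration):

```ruby
require 'open-uri'

# With a block, URI.open yields the open stream and closes it
# automatically when the block returns.
def fetch_page(url)
  URI.open(url) { |io| io.read }
end

# body = fetch_page('http://example.com')
```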
Recessional answered 14/7, 2013 at 17:52 Comment(3)
It's important to warn that open-uri won't follow redirects. (Bassist)
@Bassist Which is great; it prevents malicious redirects like file:///etc/passwd. (Crayfish)
Please note that it will not close the connection. Use https://mcmap.net/q/111250/-how-to-get-the-html-source-of-a-webpage-in-ruby-duplicate (Pesky)
require 'net/http'
result = Net::HTTP.get(URI.parse('http://www.example.com/about.html'))
# or, passing host and path separately
result = Net::HTTP.get('www.example.com', '/about.html')
Biestings answered 2/1, 2011 at 23:23 Comment(1)
I don't think URI.parse is necessary; URI('http://www.example.com/') gives the same result. (Gent)
I prefer httpclient over Net::HTTP.

require 'httpclient'

client = HTTPClient.new
puts client.get_content('http://www.example.com/index.html')

HTTParty is a good choice if you're making a class that's a client for a service. It's a convenient mixin that gives you 90% of what you need. See how short the Google and Twitter clients are in its examples.

And to answer your second question: no, I wouldn't put this functionality in a controller. I'd use a model instead, if possible, to encapsulate the particulars (perhaps using HTTParty) and simply call it from the controller.
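To make that controller/model split concrete, here is a minimal sketch using only the standard library; the class and method names are hypothetical:

```ruby
require 'net/http'
require 'uri'

# Hypothetical plain-Ruby wrapper that keeps HTTP details out of controllers.
class ExamplePageClient
  def initialize(base_url = 'http://www.example.com')
    @base = URI.parse(base_url)
  end

  # Returns the response body for a path, or nil on a non-2xx response.
  def fetch(path = '/index.html')
    res = Net::HTTP.get_response(@base.host, path, @base.port)
    res.is_a?(Net::HTTPSuccess) ? res.body : nil
  end
end

# In a controller action you would then just write:
#   @page = ExamplePageClient.new.fetch('/about.html')
```

Swapping Net::HTTP for HTTParty (or anything else) then touches only this one class, not every controller that uses it.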

Extravagate answered 2/1, 2011 at 23:53 Comment(2)
And how is it possible to safely pass parameters in the URL? E.g. http://www.example.com/index.html?param1=test1&param2=test2. Then I need to read the parameters on the other website and prepare the response. But how can I read the parameters? (Budbudapest)
What do you mean, you need to read the other website's parameters? How would that even be possible? What are you trying to achieve? (Debauchee)
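On the parameter question in the comments above: the standard library can build a safely escaped query string, so values never need to be pasted into the URL by hand. A small sketch (the parameter names are just placeholders):

```ruby
require 'uri'

# URI.encode_www_form percent-escapes each value for use in a query string.
uri = URI.parse('http://www.example.com/index.html')
uri.query = URI.encode_www_form(param1: 'test1', param2: 'test 2&3')
puts uri.to_s
# → http://www.example.com/index.html?param1=test1&param2=test+2%263
```

On the receiving Rails side, such values show up in the controller's params hash (params[:param1] and so on).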
Here is code that works if you are making a REST API call from behind a proxy:

require 'uri'
require 'net/http'

proxy_host = '<proxy addr>'
proxy_port = '<proxy_port>'
proxy_user = '<username>'
proxy_pass = '<password>'

uri = URI.parse('https://saucelabs.com:80/rest/v1/users/<username>')
proxy = Net::HTTP::Proxy(proxy_host, proxy_port, proxy_user, proxy_pass)

req = Net::HTTP::Get.new(uri.path)
req.basic_auth('<sauce_username>', '<sauce_password>')

result = proxy.start(uri.host, uri.port) do |http|
  http.request(req)
end

puts result.body
Stalnaker answered 3/8, 2014 at 21:3 Comment(0)
My two favorite ways to grab the contents of URLs are OpenURI and Typhoeus.

OpenURI because it's everywhere, and Typhoeus because it's very flexible and powerful.

Zusman answered 3/1, 2011 at 3:22 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.