WebClient
This C# class downloads files and web pages. Found in the System.Net namespace, it provides simple one-call methods, such as DownloadString and DownloadData, for fetching remote resources.
Class notes
WebClient is versatile. It makes it easy to download web pages for testing, and we often wrap it in a using statement. Make sure to include the System.Net namespace.
This first example creates a new WebClient instance and sets its user agent.
WebClient will download the page, and the server will think the request came from Internet Explorer 6. The result is a byte array of data.
You can customize the download request by assigning an entry in the Headers collection. You can also take the WebHeaderCollection returned by Headers and call the Add, Remove, Set and Count members on it. The DownloadData method allocates the result bytes on the managed heap.

using System;
using System.Net;

// Create web client simulating IE6.
using (WebClient client = new WebClient())
{
    client.Headers["User-Agent"] =
        "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0)";

    // Download data.
    byte[] arr = client.DownloadData("http://www.example.com/");

    // Write values.
    Console.WriteLine(arr.Length);
}

Output

1256
This example sets two HTTP request headers on the Headers collection of WebClient. It then reads from the ResponseHeaders collection.
You assign string keys to the string values you want the headers to be set to. After the download, you can read the client.ResponseHeaders collection. Before a response has been received, ResponseHeaders is null.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();

        // Set user agent and also accept-encoding headers.
        client.Headers["User-Agent"] =
            "Googlebot/2.1 (+http://www.googlebot.com/bot.html)";
        client.Headers["Accept-Encoding"] = "gzip";

        // Download data.
        byte[] arr = client.DownloadData("http://www.dotnetperls.com/");

        // Get response header.
        string contentEncoding = client.ResponseHeaders["Content-Encoding"];

        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(arr.Length);
        Console.WriteLine(contentEncoding);
    }
}

Output

--- WebClient result ---
2040
gzip
Next, we download a web page from the Internet into a string. We create a WebClient and then pass the URL we want to download as the parameter to the DownloadString method. The page contents are returned as a string.
The DownloadString method calls into lower-level routines in the Windows network stack. It allocates a string on the managed heap, then returns a value referencing that data.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();

        // Download string.
        string value = client.DownloadString("http://www.dotnetperls.com/");

        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(value.Length);
        Console.WriteLine(value);
    }
}
Console program
This console program receives the target URL you want to download and the local file you want to append to. If the local file is not found, it will be created. You can launch the compiled program from another application with the Process.Start method.

using System;
using System.IO;
using System.Net;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            Console.WriteLine("*** Log Append Tool ***");
            Console.WriteLine("    Specify file to download, log file");
            Console.WriteLine("Downloading: {0}", args[0]);
            Console.WriteLine("Appending: {0}", args[1]);

            // Download url.
            using (WebClient client = new WebClient())
            {
                string value = client.DownloadString(args[0]);

                // Append url.
                File.AppendAllText(args[1],
                    string.Format("--- {0} ---\n", DateTime.Now) + value);
            }
        }
        finally
        {
            Console.WriteLine("[Done]");
        }
    }
}
This program implements a console application that times a web page at any URL. It downloads the page a fixed number of times (_max, here 5) and reports the total and per-page elapsed time.

using System;
using System.Diagnostics;
using System.Net;

class Program
{
    const int _max = 5;

    static void Main(string[] args)
    {
        try
        {
            // Get url.
            string url = args[0];

            // Report url.
            Console.ForegroundColor = ConsoleColor.White;
            Console.WriteLine("... PageTimeTest: times web pages");
            Console.ResetColor();
            Console.WriteLine("Testing: {0}", url);

            // Fetch page.
            using (WebClient client = new WebClient())
            {
                // Set gzip.
                client.Headers["Accept-Encoding"] = "gzip";

                // Download.
                // ... Do an initial run to prime the cache.
                byte[] data = client.DownloadData(url);

                // Start timing.
                Stopwatch stopwatch = Stopwatch.StartNew();

                // Iterate.
                for (int i = 0; i < Math.Min(100, _max); i++)
                {
                    data = client.DownloadData(url);
                }

                // Stop timing.
                stopwatch.Stop();

                // Report times.
                Console.WriteLine("Time required: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds);
                Console.WriteLine("Time per page: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds / _max);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            Console.WriteLine("[Done]");
        }
    }
}
You can set the request HTTP headers. You can do this with the indexer on Headers, or by treating the Headers property as a WebHeaderCollection and calling its members.
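As a sketch of that second approach, the fragment below stores Headers in a WebHeaderCollection variable and exercises Add, Set, Remove and Count before downloading; the Accept-Language values and the example.com URL are arbitrary placeholders.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Treat the Headers property as a WebHeaderCollection.
            WebHeaderCollection headers = client.Headers;
            headers.Add("Accept-Language", "en-us");
            headers.Set("Accept-Language", "en-gb"); // Replace the value.
            Console.WriteLine(headers.Count);        // Headers currently set.
            headers.Remove("Accept-Language");

            // Download with whatever headers remain.
            byte[] data = client.DownloadData("http://www.example.com/");
            Console.WriteLine(data.Length);
        }
    }
}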
You can access the response HTTP headers after you invoke DownloadData or DownloadString. These headers are found in ResponseHeaders.
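To inspect every response header rather than a single key, a minimal sketch (again using the placeholder example.com URL) can loop over the AllKeys array of the ResponseHeaders collection after the download completes.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            byte[] data = client.DownloadData("http://www.example.com/");

            // ResponseHeaders is populated once a response has been received.
            WebHeaderCollection response = client.ResponseHeaders;
            foreach (string key in response.AllKeys)
            {
                Console.WriteLine("{0}: {1}", key, response[key]);
            }
        }
    }
}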
It is possible to access web pages on separate threads. The WebClient class provides the OpenReadAsync, DownloadDataAsync, DownloadFileAsync and DownloadStringAsync methods. These methods return void; the downloaded data is delivered through the corresponding Completed events.
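Here is a minimal sketch of that event-based pattern, assuming the placeholder example.com URL: it starts DownloadStringAsync and blocks on a ManualResetEvent until the DownloadStringCompleted event fires, so the console program does not exit early.

using System;
using System.Net;
using System.Threading;

class Program
{
    static void Main()
    {
        using (ManualResetEvent done = new ManualResetEvent(false))
        using (WebClient client = new WebClient())
        {
            // The result arrives in the DownloadStringCompleted event.
            client.DownloadStringCompleted += (sender, e) =>
            {
                if (e.Error == null)
                {
                    Console.WriteLine(e.Result.Length);
                }
                done.Set();
            };

            // Start the download; this call returns immediately.
            client.DownloadStringAsync(new Uri("http://www.example.com/"));

            // Wait until the completed event has run.
            done.WaitOne();
        }
    }
}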
Dispose
The WebClient class holds onto some system resources that are required to access the network stack in Microsoft Windows. These resources are eventually cleaned up on their own, but if you call Dispose, or wrap the client in a using statement, you can make them be cleaned up at more predictable times.
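For illustration, the try/finally form below is what a using statement expands to; calling Dispose directly in the finally block (with the usual example.com placeholder URL) releases the resources at the same point a using block would.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Equivalent to: using (WebClient client = new WebClient()) { ... }
        WebClient client = new WebClient();
        try
        {
            string page = client.DownloadString("http://www.example.com/");
            Console.WriteLine(page.Length);
        }
        finally
        {
            // Release the underlying network resources deterministically.
            client.Dispose();
        }
    }
}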
We used the WebClient class in the System.Net namespace. This class allows us to download web pages into strings and byte arrays. It will fetch external resources.