
HttpClient requests for a datasheet from ST's website time out, while the same request to TI's website works — what is the cause, and how can it be fixed?


Problems where the same code behaves differently against different sites are genuinely frustrating. I have run into similar anti-scraping restrictions on vendor websites before, and ST's servers validate non-browser requests noticeably more strictly than TI's. Let me break down the likely causes and then give targeted fixes:

Likely root causes

  1. Incomplete request headers, so ST's server treats the request as not coming from a legitimate browser: your code only sets basic headers such as User-Agent and Accept, but a real browser also sends Referer, Accept-Language, Connection, and several other fields, and ST's anti-scraping logic may be checking for them.
  2. TLS version or cipher-suite incompatibility: ST's server may only accept TLS 1.2 or later, while the default HttpClientHandler configuration in .NET may not have the newest TLS protocols enabled.
  3. Missing session cookies: a browser automatically obtains and sends session cookies when visiting ST's site; your code makes stateless requests with no cookie handling, so the server blocks it.
  4. Timeout set too short: ST's datasheet server may respond more slowly than TI's, and a 10-second timeout may not be enough.
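Before applying fixes, it helps to confirm which of these is actually happening. As a rough diagnostic sketch (for .NET Core / .NET 5+, using the datasheet URL from your code): a TaskCanceledException means a genuine timeout, while an HttpRequestException wrapping an AuthenticationException points at a TLS handshake failure rather than slowness:

```csharp
using System;
using System.Net.Http;
using System.Security.Authentication;
using System.Threading.Tasks;

class Probe
{
    static async Task Main()
    {
        using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) };
        try
        {
            var resp = await client.GetAsync("https://www.st.com/resource/en/datasheet/stlq015.pdf");
            Console.WriteLine($"Status: {(int)resp.StatusCode}");
        }
        catch (TaskCanceledException)
        {
            // The request exceeded HttpClient.Timeout: a real timeout
            // (cause 4, or the server stalling suspected non-browser clients).
            Console.WriteLine("Timed out");
        }
        catch (HttpRequestException ex) when (ex.InnerException is AuthenticationException)
        {
            // The TLS handshake itself was rejected: points at cause 2.
            Console.WriteLine("TLS failure: " + ex.InnerException.Message);
        }
        catch (HttpRequestException ex)
        {
            // Other connection-level errors (DNS failure, connection refused, etc.).
            Console.WriteLine("Request failed: " + ex.Message);
        }
    }
}
```

An outright HTTP error status (403, for example) rather than an exception would instead point at the header/cookie checks in causes 1 and 3.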

Targeted fixes

1. Add the full set of browser-level request headers

Add the headers a real browser actually sends; this is the step most likely to help. For example, add these to your code:

// Add a Referer (set to ST's home page, simulating a request navigated from within the site)
httpClient.DefaultRequestHeaders.Referrer = new Uri("https://www.st.com/");
// Accepted languages
httpClient.DefaultRequestHeaders.AcceptLanguage.Add(new StringWithQualityHeaderValue("en-US", 1.0));
httpClient.DefaultRequestHeaders.AcceptLanguage.Add(new StringWithQualityHeaderValue("en", 0.9));
// Keep the connection alive
httpClient.DefaultRequestHeaders.Connection.Add("keep-alive");
// Cache control (optional)
httpClient.DefaultRequestHeaders.CacheControl = new CacheControlHeaderValue() { NoCache = true };

2. Force the latest TLS versions

Before creating the HttpClientHandler, set the process-wide TLS protocol versions so the handshake is not rejected over a protocol or cipher-suite mismatch:

// Place this at the very start of your code, e.g. in Main.
// (SecurityProtocolType.Tls13 requires .NET Framework 4.8 or later.)
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls13;
// Temporarily skip certificate validation (remove this in production and validate certificates properly)
ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, errors) => true;
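One caveat: ServicePointManager settings apply to the .NET Framework HTTP stack, and HttpClient on .NET Core / .NET 5+ largely ignores them. If you are on the newer runtimes, a sketch of the equivalent configuration sets the TLS options on a SocketsHttpHandler directly:

```csharp
using System;
using System.Net.Http;
using System.Security.Authentication;

// Configure TLS on the handler itself rather than via the
// process-wide ServicePointManager switch (ignored on .NET Core+).
var handler = new SocketsHttpHandler
{
    SslOptions =
    {
        EnabledSslProtocols = SslProtocols.Tls12 | SslProtocols.Tls13,
        // Temporarily accept any certificate; remove in production.
        RemoteCertificateValidationCallback = (sender, cert, chain, errors) => true
    }
};
var httpClient = new HttpClient(handler);
```

Note that SocketsHttpHandler has its own cookie and decompression properties (UseCookies, CookieContainer, AutomaticDecompression), so the rest of the handler setup below carries over with the same names.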

3. Enable a cookie container to simulate a browser session

Let HttpClient handle cookies automatically, so that cookies obtained by a first request (e.g. to ST's home page) are sent along with the subsequent datasheet request:

HttpClientHandler handler = new HttpClientHandler() {
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate,
    AllowAutoRedirect = true,
    CheckCertificateRevocationList = false,
    UseProxy = false,
    // New: enable the cookie container
    UseCookies = true,
    CookieContainer = new CookieContainer()
};

4. Extend the timeout

Raise the timeout from 10 to 30 seconds to give ST's server enough time to respond:

httpClient.Timeout = TimeSpan.FromSeconds(30);
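If you would rather not raise the timeout for every request (your TI requests are already fast), one possible alternative — a sketch, not taken from your code — is a per-request budget via a CancellationTokenSource, which cancels just that one call:

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class PerRequestTimeout
{
    static async Task Main()
    {
        using var client = new HttpClient(); // leave client.Timeout at its default

        // Give this single request up to 30 seconds without
        // changing the timeout for every other request.
        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
        try
        {
            var resp = await client.GetAsync("https://www.st.com/", cts.Token);
            Console.WriteLine($"Status: {(int)resp.StatusCode}");
        }
        catch (TaskCanceledException)
        {
            Console.WriteLine("This request exceeded its 30-second budget");
        }
    }
}
```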

5. Simulate the full browser visit flow

Request ST's home page first to pick up the necessary session cookies, then request the datasheet link; this more closely matches real browser behavior:

// Request ST's home page before requesting the datasheet
await httpClient.GetAsync("https://www.st.com/").ConfigureAwait(false);
// Then fetch the datasheet
var httpResult2 = await httpClient.GetAsync("https://www.st.com/resource/en/datasheet/stlq015.pdf").ConfigureAwait(false);

Complete test code with the changes applied

Here is the code with all of the adjustments above folded in; you can run it directly:

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

internal class Program
{
    static void Main(string[] args)
    {
        // Force TLS 1.2/1.3 (Tls13 requires .NET Framework 4.8+)
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls13;
        // Temporarily skip certificate validation (remove in production)
        ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, errors) => true;

        var task = Xx1();
        task.Wait();
    }

    private static async Task Xx1()
    {
        HttpClientHandler handler = new HttpClientHandler()
        {
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate,
            AllowAutoRedirect = true,
            CheckCertificateRevocationList = false,
            UseProxy = false,
            UseCookies = true,
            CookieContainer = new CookieContainer()
        };

        var httpClient = new HttpClient(handler);
        httpClient.DefaultRequestHeaders.UserAgent.ParseAdd("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36");
        httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("*/*"));
        httpClient.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip"));
        httpClient.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate"));
        // Additional browser-level request headers
        httpClient.DefaultRequestHeaders.Referrer = new Uri("https://www.st.com/");
        httpClient.DefaultRequestHeaders.AcceptLanguage.Add(new StringWithQualityHeaderValue("en-US", 1.0));
        httpClient.DefaultRequestHeaders.AcceptLanguage.Add(new StringWithQualityHeaderValue("en", 0.9));
        httpClient.DefaultRequestHeaders.Connection.Add("keep-alive");
        httpClient.DefaultRequestHeaders.CacheControl = new CacheControlHeaderValue() { NoCache = true };
        // Extended timeout
        httpClient.Timeout = TimeSpan.FromSeconds(30);

        try
        {
            var httpResult1 = await httpClient.GetAsync("https://www.ti.com/lit/gpn/TPS723").ConfigureAwait(false);
            httpResult1.EnsureSuccessStatusCode();
            var resultBytes1 = await httpResult1.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
            Console.WriteLine($"TI datasheet length: {resultBytes1.Length}");
        }
        catch (TaskCanceledException taskCanceledException)
        {
            Console.WriteLine("TI Request timed out: " + taskCanceledException.Message);
        }
        catch (Exception exception)
        {
            Console.WriteLine("TI An error occurred: " + exception.Message);
        }

        try
        {
            // Request ST's home page first to obtain session cookies
            await httpClient.GetAsync("https://www.st.com/").ConfigureAwait(false);
            var httpResult2 = await httpClient.GetAsync("https://www.st.com/resource/en/datasheet/stlq015.pdf").ConfigureAwait(false);
            httpResult2.EnsureSuccessStatusCode();
            var resultBytes2 = await httpResult2.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
            Console.WriteLine($"ST datasheet length: {resultBytes2.Length}");
        }
        catch (TaskCanceledException taskCanceledException)
        {
            Console.WriteLine("ST Request timed out: " + taskCanceledException.Message);
        }
        catch (Exception exception)
        {
            Console.WriteLine("ST An error occurred: " + exception.Message);
        }
    }
}

Summary

The most likely culprits are incomplete request headers or the TLS version. Start by filling in the headers and enabling the cookie container; those two steps have fixed the vast majority of ST-site request problems I have run into. If that is still not enough, add the TLS settings and the home-page pre-request, and the download should then work.

Content sourced from Stack Exchange.
