addEventListener('fetch', function (event) {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  // Only GET requests work with this proxy; anything else is rejected
  // with 405 and an Allow header.
  if (request.method !== 'GET') {
    return new Response(`Method ${request.method} not allowed.`, {
      status: 405,
      headers: { Allow: 'GET' },
    });
  }
  // Minimal completion of the truncated handler: pass the request through.
  return fetch(request);
}
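The early return gives clients a standards-compliant 405 response and advertises the only supported method through the Allow header; remaining GET traffic is handed to fetch, which in a Workers-style runtime forwards the request upstream.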
import base64
import json
import urllib.request

def new_response(body=None, status=200, headers=None):
    # Build a Lambda-proxy-style response dict; default the headers
    # to an empty mapping rather than None.
    return {
        'statusCode': status,
        'headers': headers or {},
        'body': body,
    }
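For context, here is a minimal sketch of how a handler might use new_response. The lambda_handler name and event shape follow the usual AWS Lambda proxy convention, and the /health route is a hypothetical example, not something defined above.

def lambda_handler(event, context):
    # Hypothetical routing: answer a health check, reject everything else.
    if event.get('path') == '/health':
        return new_response(body=json.dumps({'ok': True}),
                            headers={'Content-Type': 'application/json'})
    return new_response(body='Not found', status=404)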
The training task for Transformers, formulated in the autoregressive case as next-token prediction, maximizes the log-likelihood of each token given the tokens that precede it:

$$\mathcal{L}(\theta) = \sum_{t=1}^{T} \log p_\theta(x_t \mid x_{<t})$$
Unlike RNNs, which process input sequences step by step and maintain hidden states to carry information across steps, Transformers rely on self-attention mechanisms. These mechanisms allow each element in a sequence to dynamically attend to every other element, capturing dependencies regardless of their positions in the sequence. This enables the Transformer model to parallelize training more effectively than RNNs, as it eliminates the sequential processing dictated by hidden states.
The absence of hidden states in Transformers removes the step-by-step dependency that forces RNNs to be trained sequentially: every position's representation can be computed at once from the full input sequence.
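To make that parallelism concrete, here is a minimal numpy sketch of scaled dot-product self-attention; the variable names and dimensions are illustrative, not taken from any text above.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a whole sequence at once.

    X: (T, d_model) input sequence; Wq/Wk/Wv: (d_model, d_k) projections.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project all positions in parallel
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (T, T): every position scores every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                        # (T, d_k) contextualized outputs

rng = np.random.default_rng(0)
T, d_model, d_k = 5, 8, 4
X = rng.normal(size=(T, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (5, 4)

Because the (T, T) score matrix comes from a single batched matrix product rather than a loop over time steps, there is no serial dependency between positions, which is exactly what lets Transformers exploit hardware parallelism during training.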