This is the talk page for discussing improvements to the Deep learning article. This is not a forum for general discussion of the article's subject.
Archives: 1
This level-5 vital article is rated C-class on Wikipedia's content assessment scale.
I am considering adding information regarding power consumption. Currently I see two directions: low-power inference for embedded devices[1] and low-power training for large networks[2]. Please let me know if you know of other credible and/or overview sources discussing these concerns. P.S. This may be a discussion for another page, e.g. ML or ANNs. Cheater no1 (talk) 13:16, 13 November 2020 (UTC)
I plan to attempt a slow-motion cleanup of reference spamming. The typical case is an obscure, non-notable paper that does a poor job of supporting the statements that cite it, particularly where the material remains adequately supported without it. Where all of the references on an overcited item looked equally good and applicable, I left them all in. Sincerely, North8000 (talk) 16:29, 7 March 2022 (UTC)
References
Transformer-based models now outperform sequence models such as LSTM or GRU, and Google uses a Transformer-based model to translate languages. 2402:800:6118:555E:10BE:9E4:1CEC:3EDE (talk) 18:15, 28 March 2022 (UTC)
This article was the subject of a Wiki Education Foundation-supported course assignment, between 6 September 2023 and 14 December 2023. Further details are available on the course page. Student editor(s): HELLOEXTRACREDIT (article contribs).
— Assignment last updated by HELLOEXTRACREDIT (talk) 04:30, 26 November 2023 (UTC)