Open Access
ITM Web Conf., Volume 12, 2017
The 4th Annual International Conference on Information Technology and Applications (ITA 2017)

Article Number: 03043
Number of page(s): 4
Section: Session 3: Computer
DOI: https://doi.org/10.1051/itmconf/20171203043
Published online: 05 September 2017