1. The document discusses various statistical and neural network-based models for representing words and modeling semantics, including LSI, PLSI, LDA, word2vec, and neural network language models.
2. These models represent words based on their distributional properties and contexts using techniques like matrix factorization, probabilistic modeling, and neural networks to learn vector representations.
3. Recent models like word2vec use neural networks to learn word embeddings that capture linguistic regularities and can be used for tasks like analogy solving and machine translation, as sketched below.
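The analogy property comes from simple vector arithmetic on the learned embeddings. The following minimal C++ sketch illustrates the idea with invented 3-dimensional toy vectors (real word2vec embeddings are learned from corpora and typically have hundreds of dimensions); it demonstrates the vector-offset trick only, not word2vec's training procedure.

```cpp
#include <cmath>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Cosine similarity between two embedding vectors.
double cosine(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0, na = 0, nb = 0;
    for (size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

int main() {
    // Toy embeddings, invented purely for illustration.
    std::map<std::string, std::vector<double>> emb = {
        {"king",  {0.8, 0.7, 0.1}},
        {"queen", {0.8, 0.1, 0.8}},
        {"man",   {0.2, 0.9, 0.1}},
        {"woman", {0.2, 0.2, 0.9}},
    };

    // Vector-offset analogy: king - man + woman should land near queen.
    std::vector<double> q(3);
    for (int i = 0; i < 3; ++i)
        q[i] = emb["king"][i] - emb["man"][i] + emb["woman"][i];

    // Nearest neighbor by cosine similarity, excluding the query word.
    std::string best;
    double bestSim = -1;
    for (const auto& [word, vec] : emb) {
        if (word == "king") continue;
        double s = cosine(q, vec);
        if (s > bestSim) { bestSim = s; best = word; }
    }
    std::cout << "king - man + woman ~= " << best << "\n";  // expect: queen
}
```

With these toy vectors the program prints `king - man + woman ~= queen`, the classic analogy example.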
The document discusses optimization techniques for deep learning frameworks on Intel CPUs and on the architecture of the Fugaku supercomputer. It introduces oneDNN, a performance library for deep learning operations on Intel CPUs, then explains the limitations of a static C++ implementation and how just-in-time assembly generation with Xbyak addresses them by emitting code specialized to run-time parameters. It also introduces Xbyak_aarch64, which generates optimized code for Fugaku's Scalable Vector Extension (SVE) instructions.
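As a rough illustration of the JIT idea (a minimal sketch, not oneDNN's actual kernel code), the snippet below uses Xbyak's C++ API to emit an x86-64 function at run time with a run-time constant baked into the instruction stream, much as a JIT kernel specializes on tensor shapes or strides; it assumes the System V ABI (Linux/macOS).

```cpp
#include <xbyak/xbyak.h>
#include <iostream>

// Generates, at run time, a function int f(int x) that returns x * k.
// The multiplier k is folded into the emitted instruction as an immediate,
// the way a JIT kernel folds in parameters known only at run time.
struct MulGen : Xbyak::CodeGenerator {
    explicit MulGen(int k) {
        // System V x86-64 ABI: first integer argument arrives in edi,
        // the return value goes in eax.
        imul(eax, edi, k);  // eax = edi * k
        ret();
    }
};

int main() {
    MulGen gen(7);  // specialize the generated code for k = 7
    auto f = gen.getCode<int (*)(int)>();
    std::cout << f(6) << "\n";  // prints 42
}
```

A purely static C++ implementation would have to branch on such parameters at every call; the generated code pays that cost once, at generation time.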
The document summarizes the Vanderbilt pseudopotential method. It starts from all-electron calculations and defines a pseudo-wavefunction. The pseudopotential V'loc is constructed as the sum of the local potential Vloc, the Hartree potential VH, and the exchange-correlation potential VXC; unlike other pseudopotential constructions, which subtract VH and VXC, Vanderbilt's method adds them. The method can produce both norm-conserving and ultrasoft pseudopotentials, depending on whether the generalized norm-conserving condition is satisfied.
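Written out in the notation usual in the ultrasoft-pseudopotential literature (the symbols below are assumed from that literature, not taken from the document), the two relations are:

```latex
% Screened local pseudopotential as described above (a sum, not a difference):
\[
  V'_{\mathrm{loc}}(\mathbf{r}) = V_{\mathrm{loc}}(\mathbf{r})
    + V_{H}(\mathbf{r}) + V_{XC}(\mathbf{r})
\]
% Generalized norm-conservation: comparing all-electron orbitals \psi_i with
% pseudo-orbitals \phi_i inside the cutoff radius r_c,
\[
  Q_{ij} = \langle \psi_i | \psi_j \rangle_{r<r_c}
         - \langle \phi_i | \phi_j \rangle_{r<r_c}
\]
% Requiring Q_{ij} = 0 for all i, j yields a norm-conserving potential;
% allowing Q_{ij} \neq 0, compensated by augmentation charges, yields an
% ultrasoft one.
```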
Development of highly accurate pseudopotential method and its application to ...
The document describes the development of a highly accurate pseudopotential method and its application to calculations of silicene grown on a ZrB2 surface. Key points:
1. A novel pseudopotential method, called MBK, is developed that reproduces scattering properties over a broad energy range, improving on existing pseudopotentials.
2. Calculations show that buckled silicene can grow stably on a ZrB2 surface, with strong interaction between the silicene and the substrate.
3. Calculated band structures agree well with experimental data and show splitting of the orbitals derived from silicene's Dirac cone, caused by interaction with the ZrB2 surface.