This graph maps the connections between all the collaborators on the author's publications listed on this page.
Each link represents a collaboration on the same publication, and the thickness of the link reflects the number of collaborations.
Use the mouse wheel or scroll gestures to zoom into the graph.
You can click on nodes and links to highlight them, and move nodes by dragging them.
Hold down the "Ctrl" key (or the "⌘" key) while clicking a node to open that person's list of publications.
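The structure behind such a graph is simple: every publication's author list contributes one edge per pair of co-authors, and repeated pairs increase the edge weight that drives the link thickness. The following Python sketch illustrates the idea; the author lists are abridged from the publication entries further down this page, and the plain dictionary output stands in for whatever rendering the interactive graph actually uses.

```python
from collections import Counter
from itertools import combinations

# Illustrative input: each publication reduced to its list of author names
# (abridged from the publication list below).
publications = [
    ["Ding, Z.", "Li, H.", "Shang, W."],
    ["Ding, Z.", "Li, H.", "Shang, W.", "Chen, T.-H. P."],
    ["Ding, Z.", "Tang, Y.", "Cheng, X.", "Li, H.", "Shang, W."],
]

# Each unordered pair of co-authors on the same publication counts as one
# collaboration; the accumulated count becomes the edge weight (link thickness).
edge_weights = Counter()
for authors in publications:
    for a, b in combinations(sorted(set(authors)), 2):
        edge_weights[(a, b)] += 1

for (a, b), weight in edge_weights.most_common():
    print(f"{a} -- {b}: {weight} collaboration(s)")
```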
A word cloud is a visual representation of the most frequently used words in a text or a set of texts. Each word is drawn at a size proportional to its frequency of occurrence, so the most frequently used words appear largest. This technique allows for a quick visualization of the most important themes and concepts in a text.
In the context of this page, the word cloud was generated from the author's publications. The words in this cloud come from the titles, abstracts, and keywords of the author's articles and research papers. By analyzing this word cloud, you can get an overview of the most recurring and significant topics and research areas in the author's work.
The word cloud is a useful tool for identifying trends and main themes in a corpus of texts, thus facilitating the understanding and analysis of content in a visual and intuitive way.
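The underlying computation is a word-frequency tally. The sketch below counts words from a few titles taken from the publication list on this page; a real generator would also fold in abstracts and keywords and would use a fuller stopword list and a proper layout algorithm. The stopword set and font-size scaling here are simplifying assumptions for illustration only.

```python
from collections import Counter
import re

# Illustrative input: titles taken from the publication list below.
titles = [
    "LoGenText-Plus: Improving Neural Machine Translation Based Logging Texts Generation with Syntactic Templates",
    "Towards Learning Generalizable Code Embeddings Using Task-agnostic Graph Convolutional Networks",
    "LoGenText: Automatically generating logging texts using neural machine translation",
]

# Small, assumed stopword list; a production word cloud would use a larger one.
stopwords = {"the", "a", "an", "of", "on", "in", "with", "using", "based", "towards", "can"}

counts = Counter(
    word
    for title in titles
    for word in re.findall(r"[a-z]+", title.lower())
    if word not in stopwords
)

# Size each word in proportion to its frequency, as described above
# (the scale factor is arbitrary).
for word, n in counts.most_common(10):
    print(f"{word}: frequency {n}, font size {10 + 6 * n}px")
```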
Ding, Z., Tang, Y., Cheng, X., Li, H., & Shang, W. (2024). LoGenText-Plus: Improving Neural Machine Translation Based Logging Texts Generation with Syntactic Templates. ACM Transactions on Software Engineering and Methodology, 33(2), 38 (45 pages).
Chen, J., Ding, Z., Tang, Y., Sayagh, M., Li, H., Adams, B., & Shang, W. (2023, December). IoPV: On inconsistent option performance variations [Paper]. ESEC/FSE 2023, San Francisco, CA, USA (13 pages).
Ding, Z., Li, H., Shang, W., & Chen, T.-H. P. (2023). Towards Learning Generalizable Code Embeddings Using Task-agnostic Graph Convolutional Networks. ACM Transactions on Software Engineering and Methodology, 32(2), 1-43.
Ding, Z., Li, H., Shang, W., & Chen, T.-H. P. (2022). Can pre-trained code embeddings improve model performance? Revisiting the use of code embeddings in software engineering tasks. Empirical Software Engineering, 27(3), 38 pages.
Ding, Z., Li, H., & Shang, W. (2022, March). LoGenText: Automatically generating logging texts using neural machine translation [Paper]. IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER 2022), Honolulu, HI, USA.