2021-12-10

Example Sentences:

  • Yong Zhuang: Thus, it is highly desirable to represent uncertainty in a trustworthy manner in any AI-based system.

  • Hefei Qiu: We investigated some properties of GENRE, comparing two variants of our model to BLINK on the ED task (using the WNED-KILT validation set): one trained to generate entity names and another to generate numerical identifiers (IDs).

  • Tianyu Kang: Their low variance can be particularly misleading if one does not recognize that they are biased, as repeated measurements may indicate a low variance, and yet no meaningful conclusion can be drawn because the bias is recommender-dependent, i.e. the value of the bias depends on the recommender algorithm being evaluated.

  • Zihan Li: Since time is actually continuous, a better approach would be to express dynamics as a set of differential equations and then integrate them from an initial state at t0 to a final state at t1. (See the sketch after this list.)
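
To make the continuous-time idea in Zihan Li's sentence concrete, here is a minimal sketch of writing the dynamics as a set of differential equations and integrating them from an initial state at t0 to a final state at t1. The damped-pendulum system and the choice of SciPy's solve_ivp are illustrative assumptions, not the model or code from the paper.

```python
# Minimal sketch: define continuous-time dynamics as dstate/dt = f(t, state)
# and integrate from an initial state at t0 to a final state at t1.
# The damped pendulum below is a stand-in system, not the one from the paper.
import numpy as np
from scipy.integrate import solve_ivp

def dynamics(t, state):
    """Return dstate/dt for a damped pendulum; state = [angle, angular_velocity]."""
    theta, omega = state
    return [omega, -0.25 * omega - 9.81 * np.sin(theta)]

t0, t1 = 0.0, 10.0
initial_state = [np.pi / 3, 0.0]   # released from 60 degrees, at rest

solution = solve_ivp(dynamics, (t0, t1), initial_state)
final_state = solution.y[:, -1]    # state of the system at t1
print(final_state)
```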

Before & After:

  • Yong Zhuang
    • before: Because of the highly complex, chaotic, highly dynamic, and uncertainties along with the time and space, long-term predictive modeling on big Spatio-temporal data is a major challenge in the big Spatio-temporal analysis field.

    • after: Because of the high complexity, chaos, and uncertainty that come with data distributed in space and time, long-term predictive modeling on big spatio-temporal data is a significant challenge across many fields of scientific study.

  • Tianyu Kang
    • before: We experiment this on Cifar-10 with VGG19, and the result in table 3 shows within 250 epochs, the model training with shuffled labels have 52% less skill average active rate.

    • after: Besides the theoretical guarantees, we empirically validate our approach on CIFAR-10 with VGG19, and the result at the 250th epoch is shown in Table 3. It is worth mentioning that, for this particular experiment, the model trained with randomly shuffled labels has a 52% lower skill average active rate than the baseline, which is in line with theoretical expectations.

  • Zihan Li
    • before: Current symbolic methods have limitations in some situations since they ignore the internal connections between mathematical characters.

    • after: While current token-based symbolic methods are widely accepted and easier to implement and interpret, their results can be unreliable and of limited practical use because they ignore the internal grammar of mathematical expressions during processing.

Paper Presentation Examples: