  • Health Insurance Market Overview by Advanced Technology, Future Outlook 2030

    This report studies the Health Insurance Market across many aspects of the industry, such as market size, status, trends, and forecast; it also provides brief profiles of the competitors and the specific growth opportunities with key market drivers. Find the complete Health Insurance Market analysis, segmented by company, region, type, and application, in the report.
    The report offers valuable insight into the progress of the Health Insurance Market and the approaches related to it, with an analysis of each region. It goes on to discuss the dominant aspects of the market and examine each segment.
    Top Key Players:
    • Aetna
    • Anthem Health Insurance
    • Centene
    • Cigna Corporation
    • HCSC
    • Highmark Inc.
    • Jubilee Holdings Limited
    • Kaiser Permanente
    • Ottonova
    • United Healthcare

    Read the Detailed Index of the Full Research Study @ https://www.datalibraryresearch.com/market-analysis/health-insurance-market-4810

    The Global Health Insurance Market is segmented by company, region (country), type, and application. Players, stakeholders, and other participants in the global Health Insurance Market will be able to gain the upper hand by using the report as a powerful resource. The segmental analysis focuses on revenue and forecast by region (country), type, and application for the period 2023–2030.

    Market segmentation by region; the regional analysis covers:

    • North America (United States, Canada and Mexico)
    • Europe (Germany, France, UK, Russia and Italy)
    • Asia-Pacific (China, Japan, Korea, India and Southeast Asia)
    • South America (Brazil, Argentina, Colombia, etc.)
    • Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria and South Africa)

    Research objectives:
    • To study and analyze the Health Insurance Market size by key regions/countries, product type, and application, with historical data from 2018 to 2020 and forecasts to 2030.
    • To understand the structure of the Health Insurance Market by identifying its various subsegments.
    • To focus on the key global Health Insurance Market players: to define, describe, and analyze their value, market share, competitive landscape, SWOT profile, and development plans for the next few years.
    • To analyze the Health Insurance Market with respect to individual growth trends, future prospects, and contribution to the total market.
    • To share detailed information about the key factors influencing the growth of the market (growth potential, opportunities, drivers, industry-specific challenges and risks).
    • To project the size of Health Insurance submarkets with respect to key regions (along with their respective key countries).
    • To analyze competitive developments such as expansions, agreements, new product launches and acquisitions in the market.
    • To strategically profile the key players and comprehensively analyze their growth strategies.

    The report lists the major players in each region and their respective market shares on the basis of global revenue. It also explains their strategic moves over the past few years: investments in product innovation and changes in leadership undertaken to stay ahead of the competition. This gives the reader an edge over others, as well-informed decisions can be made with a holistic picture of the market.

    Key questions answered in this report
    • What will the market size be in 2030 and what will the growth rate be?
    • What are the key market trends?
    • What is driving this market?
    • What are the challenges to market growth?
    • Who are the key vendors in this market space?
    • What are the market opportunities and threats faced by the key vendors?
    • What are the strengths and weaknesses of the key vendors?

    Table of Contents: Health Insurance Market
    • Part 1: Overview of Health Insurance Market
    • Part 2: Health Insurance: Global Market Status and Forecast by Regions
    • Part 3: Global Market Status and Forecast by Types
    • Part 4: Global Market Status and Forecast by Downstream Industry
    • Part 5: Market Driving Factor Analysis
    • Part 6: Market Competition Status by Major Manufacturers
    • Part 7: Major Manufacturers Introduction and Market Data
    • Part 8: Upstream and Downstream Market Analysis
    • Part 9: Cost and Gross Margin Analysis
    • Part 10: Marketing Status Analysis
    • Part 11: Market Report Conclusion
    • Part 12: Research Methodology and Reference

    Browse More Reports:
    • Cyber Insurance Market
    • Swine Feed Market
    • Intelligent Process Automation Market
    • Property and Casualty Insurance Market
    About Us: Data Library Research is a market research company with a passion for helping brands grow, discover, and transform. We want our clients to make wholehearted, long-term business decisions. Data Library Research is committed to delivering output from market research studies that is fact-based and relevant across the globe. We offer premier market research services covering all industry verticals, including aerospace and defense, agriculture and food, automotive, basic materials, consumer, energy, life sciences, manufacturing, services, telecom, education, security, and technology. We make an honest attempt to provide clients with objective strategic insight, which ultimately results in excellent outcomes.
    Contact Us: Rohit Shrivas,
    Senior Manager International Sales and Marketing
    Data Library Research
    info@datalibraryresearch.com
    Ph: +13608511343 (US)
    Follow Us:
    LINKEDIN | FACEBOOK | TWITTER
    Health Insurance Market Size, Trends & Forecast by 2030
    The Health Insurance Market is currently valued at USD 2.01 trillion and is anticipated to grow at a rate of 19.5% through the forecast period to 2030.
  • Signal Intelligence Market – Outlook, Size, Share & Forecast 2030

    What is Signal Intelligence?

    Signal Intelligence (SIGINT) is a type of intelligence gathering that involves the collection and analysis of electronic signals. This can include communications signals, such as radio, radar, and satellite communications, as well as non-communications signals, such as electronic emissions from radars, weapons systems, and other electronic devices.

    SIGINT is a vital source of intelligence for both military and civilian organizations. It can be used to track enemy movements, identify potential threats, and gather information about foreign governments and organizations.

    How is Signal Intelligence Collected?

    SIGINT can be collected in a variety of ways, including:

    Eavesdropping: This involves intercepting electronic signals as they travel through the air or over a physical medium, such as a cable.

    Signals Exploitation: This involves analyzing collected signals to extract information, such as the source of the signal, the time and date of the transmission, and the content of the message.

    Signals Intelligence (SIGINT) Fusion: This involves combining SIGINT with other sources of intelligence, such as human intelligence (HUMINT) and imagery intelligence (IMINT), to produce a more complete picture of a target or situation.

    What are the Uses of Signal Intelligence?

    SIGINT can be used for a variety of purposes, including:

    Military Operations: SIGINT can be used to track enemy movements, identify potential threats, and gather information about foreign governments and organizations. This information can be used to plan and execute military operations.

    Law Enforcement: SIGINT can be used to track criminals, identify potential threats, and gather evidence for criminal investigations.

    National Security: SIGINT can be used to protect national security by identifying and tracking potential threats, such as terrorist attacks and cyberattacks.

    Commercial Applications: SIGINT can also be used for commercial purposes, such as tracking shipping movements, monitoring financial transactions, and protecting intellectual property.

    Browse In-depth Market Research Report (100 Pages) on Signal Intelligence Market

    https://www.marketresearchfuture.com/reports/signal-intelligence-market-7624

    The Future of Signal Intelligence

    The future of SIGINT is likely to be shaped by a number of factors, including the increasing use of encrypted communications, the proliferation of wireless devices, and the growth of the Internet of Things (IoT).

    As communications become more encrypted, SIGINT agencies will need to develop new techniques for collecting and analyzing encrypted signals. The proliferation of wireless devices and the growth of the IoT will create new opportunities for SIGINT collection, but it will also pose new challenges, such as the need to collect and analyze data from a massive number of devices.

    Despite these challenges, SIGINT is likely to remain a vital source of intelligence for both military and civilian organizations in the years to come.

    Conclusion

    Signal Intelligence is a powerful tool that can be used to gather information about a wide range of targets. It is a valuable asset for both military and civilian organizations, and it is likely to become even more important in the future.

    Related Reports

    Real Time Bidding Market - Real Time Bidding Market Size is Projected to Reach USD 19.7 billion at a 19.40% CAGR by 2030

    Smartphone Operating System Market - Smartphone Operating System Market Projected to Grow at a 20.00% CAGR by 2030

    Procurement Outsourcing Market - Procurement Outsourcing Market Projected to Hit USD 7.5 billion at a 13.70% CAGR by 2030
    Signal Intelligence Market Size and Forecast to 2027 | MRFR
    The Signal Intelligence Market is projected to reach a value of USD 23.4 billion by 2030, growing at a CAGR of 5.40% during 2022–2030; the market is segmented by Application and Type.
  • Category theory offers a different line of thought: we can study the properties of an object through its relationships with other objects. A category is not merely a collection of objects; the objects in it are also connected by many different relationships. We can gather all groups together to form the category of groups, but groups are also related to one another by group homomorphisms; likewise we can gather all topological spaces into the category of topological spaces, but topological spaces are also related by continuous functions. More precisely, then, a category is a whole consisting of a class of objects together with the relationships between them, or, in more mathematical language, the morphisms. We write Grp for the category of all groups and Top for the category of all topological spaces. Similarly, we write Set for the category of all sets (from the standpoint of mathematical logic, Set can be understood as a model of your favorite axiomatic set theory, such as ZF, ZFC, ZFC+CH, or even ZF+Atoms).

    From what we already know, many definitions can be stated equivalently in terms of morphisms between objects. A subgroup can equivalently be viewed as an injective group homomorphism, and a quotient group (or normal subgroup) as a surjective group homomorphism, just as a subset can equivalently be viewed as an injection and a quotient set can be described by a surjection.

    What may be harder to observe directly than substructures and quotient structures is that most of the universal mathematical constructions we encounter can also be expressed in terms of morphisms between objects. Here is just one example.

    The Cartesian product of two sets is defined as the following set:

    A x B = { (a, b) : a ∈ A, b ∈ B }

    Here the way we define the Cartesian product matches the way we are used to describing objects: we define the set by specifying the elements of A x B. But it can equally well be defined in terms of morphisms between sets. From the Cartesian product A x B as just defined, we can observe the following two important morphisms:

    πA : A x B → A,   πB : A x B → B

    These two morphisms are the projections from A x B to A and to B. In a certain sense, this pair of projections is the "best" among all pairs of morphisms from some set C to the sets A and B. That sentence may not parse yet, so let me explain in more detail: consider any pair (f, g), where f : C → A and g : C → B are two morphisms. For any such pair, we can uniquely determine a new morphism <f,g> : C → A x B as follows:

    <f,g>(c) = (f(c), g(c)) for every c ∈ C

    And we observe that f and g can be recovered from this new morphism <f,g> and the two projection maps πA, πB:

    f = πA ∘ <f,g>,   g = πB ∘ <f,g>   (1)

    In this sense, every such pair (f, g) arises by composing a unique map from C to A x B with the projection maps. In category theory, we can turn this property around and take it as the definition of A x B! That is, we can define A x B to be a set equipped with two morphisms πA : A x B → A and πB : A x B → B such that, for any set C and any two morphisms f : C → A and g : C → B, there is a uniquely determined map <f,g> : C → A x B making (1) hold. In category theory this kind of property is often expressed by a commutative diagram:

    [Commutative diagram: from the apex C, the arrows f : C → A and g : C → B factor through the unique <f,g> : C → A x B, i.e. πA ∘ <f,g> = f and πB ∘ <f,g> = g.]

    Such a property is called a universal property. Looking only at the definition above, it may seem hard to understand what the universal property is saying. Intuitively, however, the universal property describes the function of A x B, or the way it is used. To help the reader see what a universal property is doing, let me describe the very close connection between universal properties and proofs.
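    The universal property above can be sketched in code. The following is a minimal illustration of my own (the names proj_a, proj_b, pairing, and the example sets are assumptions, not from the text), using Python functions between finite sets as the morphisms:

    ```python
    # Morphisms are plain Python functions; sets are finite for illustration.
    A = {1, 2}
    B = {"x", "y"}
    C = {10, 20, 30}

    # The two projections out of the Cartesian product A x B.
    def proj_a(p):  # pi_A : A x B -> A
        return p[0]

    def proj_b(p):  # pi_B : A x B -> B
        return p[1]

    # Any pair of morphisms f : C -> A, g : C -> B ...
    def f(c):
        return 1 if c < 25 else 2

    def g(c):
        return "x" if c % 20 == 0 else "y"

    # ... factors uniquely through A x B via the pairing <f, g>.
    def pairing(f, g):
        return lambda c: (f(c), g(c))

    fg = pairing(f, g)

    # Equation (1): f = pi_A . <f,g> and g = pi_B . <f,g>, checked pointwise.
    assert all(proj_a(fg(c)) == f(c) for c in C)
    assert all(proj_b(fg(c)) == g(c) for c in C)
    ```

    Uniqueness is also visible here: any h : C → A x B with proj_a(h(c)) = f(c) and proj_b(h(c)) = g(c) must send c to (f(c), g(c)), which is exactly fg.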

    In mathematical logic, we can characterize the logical connective conjunction by certain proof-theoretic properties. Writing ∧ for conjunction and ⊢ for the entailment relation, conjunction evidently satisfies the following:

    A ∧ B ⊢ A,   A ∧ B ⊢ B   (2)

    The rules above tell us what we can do once we have a conjunctive proposition. Another rule tells us how to obtain one:

    if C ⊢ A and C ⊢ B, then C ⊢ A ∧ B   (3)

    The two proof-theoretic properties (2) and (3) are entirely natural; in mathematical logic they are called the proof-theoretic semantics of conjunction. Can the reader already see the connection between the characterization of conjunction given by (2) and (3) and the universal property of the Cartesian product discussed earlier?

    If not, let us restate it in words. We can define A ∧ B to be a proposition that entails A (A ∧ B ⊢ A) and entails B (A ∧ B ⊢ B), and such that for any proposition C, if C ⊢ A and C ⊢ B, then C ⊢ A ∧ B. If we regard the entailment (or provability) relation as morphisms, we find that the proof-theoretic characterization of conjunction and the universal-property definition of the Cartesian product are almost exactly the same! A famous slogan of proof-theoretic semantics is "meaning as use": the meaning of a logical connective is determined by how it is used. With this intuition, you should be able to understand much better what a universal property is.
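    This correspondence can be made literal in a proof assistant. A minimal Lean 4 sketch of my own (not from the text), in which the elimination rules (2) are exactly the projections and the introduction rule (3) is exactly the pairing <f,g>:

    ```lean
    -- (2): a conjunction entails each conjunct (the "projections").
    example (A B : Prop) (h : A ∧ B) : A := h.1
    example (A B : Prop) (h : A ∧ B) : B := h.2

    -- (3): from C ⊢ A and C ⊢ B we obtain C ⊢ A ∧ B (the "pairing" ⟨f, g⟩).
    example (A B C : Prop) (f : C → A) (g : C → B) : C → A ∧ B :=
      fun c => ⟨f c, g c⟩
    ```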

    We can summarize similarly: category theory defines and describes objects in general by their function, that is, by their relationships with other objects.

    A digression: the connection between proofs and universal properties noted above reflects a deeper correspondence between logic and category theory. In a certain specific sense (I must stress: not in every sense), logic and categories are equivalent. But this introductory article on categorical thinking cannot cover that deeper connection.

    Back to the main topic. We have successfully characterized the Cartesian product by a universal property, but two further points need to be made. First, it must be stressed that what matters is not only the set A x B itself; equally important are the two projection maps πA and πB. There are many maps from A x B to A and to B, but clearly only the pair πA, πB satisfies the universal property stated above. So, more precisely, what the universal property defines is not merely the set A x B but the whole package of a set together with its two projection maps. The second and more important point, one that runs through the philosophy of category theory, is that the universal property does not actually pin down a unique set with two maps; for example, the set B x A equipped with the corresponding projections also satisfies the same universal property.

    Strictly speaking, from the set-theoretic point of view, A x B and B x A are two different sets. This raises a more important question: in what sense can we say that the universal property defines A x B? If it cannot even determine a unique set, we would not ordinarily accept it as a good definition. The basic theory of categories answers this question: if two structures both satisfy the same universal property, then they must be isomorphic! In our example, although A x B and B x A are two different sets, there is an obvious isomorphism between them:

    A x B → B x A,   (a, b) ↦ (b, a)

    Roughly speaking, this is the content of the Yoneda lemma, which underlies all of category theory: in a category, structures satisfying the same universal property are necessarily isomorphic (this is only a rough description of the Yoneda lemma; interested readers can look up a more precise statement). In a mathematician's language, the universal property defines the Cartesian product up to isomorphism.
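    The isomorphism above is easy to check concretely. A small sketch of my own (the names swap and unswap are assumptions) verifying that the swap map is a bijection compatible with the projections:

    ```python
    # The canonical isomorphism between A x B and B x A.
    def swap(p):
        a, b = p
        return (b, a)

    def unswap(p):
        b, a = p
        return (a, b)

    A = {1, 2}
    B = {"x", "y"}
    product_ab = {(a, b) for a in A for b in B}

    # swap and unswap are mutually inverse bijections ...
    assert all(unswap(swap(p)) == p for p in product_ab)
    assert {swap(p) for p in product_ab} == {(b, a) for a in A for b in B}

    # ... and they commute with the projections: the first component of
    # swap(p) in B x A is exactly the second component of p in A x B.
    assert all(swap(p)[0] == p[1] for p in product_ab)
    ```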

    Why do we only wish to define objects up to isomorphism? Or, more broadly, why is the way of thinking that uses universal properties and focuses on an object's function and its relationships with other things more useful to us? Below I list just a few reasons that matter greatly to me.
  • https://azure.microsoft.com/en-us/blog/announcing-microsoft-s-coco-framework-for-enterprise-blockchain-networks/
    https://opensourcelibs.com/lib/diffnet
    https://github.com/Messi-Q/GraphDeeSmartContract
    https://xscode.com/jdlc105/Must-read-papers-and-continuous-tracking-on-Graph-Neural-Network-GNN-progress
    Must-read papers and continuous tracking of Graph Neural Network (GNN) progress
    Many important real-world applications and questions come in the form of graphs, such as social networks, protein-protein interaction networks, brain networks, chemical molecular graphs, and 3D point clouds. Driven by this interdisciplinary research, neural network models for graph data have become an emerging research hotspot. Among the contributors are two of the three pioneers of deep learning, Professor Yann LeCun (2018 Turing Award winner) and Professor Yoshua Bengio (2018 Turing Award winner), as well as the renowned Professor Jure Leskovec of the Stanford University AI Lab.

    This project focuses on GNNs, lists relevant must-read papers, and keeps track of progress. We look forward to promoting this direction and providing some help to researchers in it.

    Contributed by Allen Bluce (Dr. Bentian Li) and Anne Bluce (Dr. Yunxia Lin). If something is wrong or you have a GNN-related issue, feel free to send an email (jdlc105@qq.com, lbtjackbluce@gmail.com).

    Technology keywords: Graph Neural Network, Graph Convolutional Network, Graph Network, Graph Attention Network, Graph Auto-Encoder, Graph Convolutional Reinforcement Learning, Graph Capsule Neural Network, ...

    GNNs and their variants are an emerging and powerful neural network approach. Their application is no longer limited to the original field; they have flourished in many other areas, such as data visualization, image processing, NLP, recommendation systems, computer vision, bioinformatics, cheminformatics, drug development and discovery, and smart transportation.

    Very hot research topic: the representative work, Graph Convolutional Networks (GCNs), proposed by T.N. Kipf and M. Welling (ICLR 2017, [5] in the conference paper list), had been cited 1,020 times on Google Scholar as of 09 May 2019. Updates: 1,065 (20 May 2019); 1,106 (27 May 2019); 1,227 (19 June 2019); 1,377 (8 July 2019); 1,678 (17 Sept. 2019); 1,944 (29 Oct. 2019); 2,232 (9 Dec. 2019); 2,677 (2 Feb. 2020); 3,018 (17 Mar. 2020); 3,560 (27 May 2020); 4,060 (3 July 2020); 5,371 (25 Oct. 2020); 6,258 (1 Jan. 2021).
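    The propagation rule of that paper, H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I, can be sketched in a few lines of NumPy. This is a toy forward pass of my own for illustration, not the authors' reference implementation:

    ```python
    import numpy as np

    def gcn_layer(adj, features, weights):
        """One GCN propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
        a_hat = adj + np.eye(adj.shape[0])             # add self-loops
        d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))  # D^-1/2 as a vector
        a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
        return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU

    # Toy graph: 3 nodes in a path, 2 input features, 2 output features.
    adj = np.array([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
    rng = np.random.default_rng(0)
    h = gcn_layer(adj, rng.normal(size=(3, 2)), rng.normal(size=(2, 2)))
    print(h.shape)  # (3, 2)
    ```

    Stacking two such layers, with the second one's weights sized for the number of classes, reproduces the two-layer architecture used in the paper's semi-supervised classification experiments.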

    Thanks for the many stars and the support from developers and scientists on GitHub around the world! We will continue to make this project better.

    Project start time: 11 Dec. 2018; latest update: 1 Jan. 2021

    New papers on GNN models and their applications have come from NeurIPS 2020, AAAI 2021, and more. We are waiting for more papers to be released.

    Survey papers:
    Bronstein M M, Bruna J, LeCun Y, et al. Geometric deep learning: going beyond euclidean data. IEEE Signal Processing Magazine, 2017, 34(4): 18-42. paper

    Jie Zhou, Ganqu Cui, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Maosong Sun, Graph Neural Networks: A Review of Methods and Applications, ArXiv, 2018. paper.

    Battaglia P W, Hamrick J B, Bapst V, et al. Relational inductive biases, deep learning, and graph networks, arXiv 2018. paper

    Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu(Fellow,IEEE), A Comprehensive Survey on Graph Neural Networks, IEEE Transactions on Neural Networks and Learning Systems, 2020. paper.

    Ziwei Zhang, Peng Cui, Wenwu Zhu, Deep Learning on Graphs: A Survey, IEEE Transactions on Knowledge and Data Engineering, 2020. paper.

    Chen Z, Chen F, Zhang L, et al. Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph Neural Networks. arXiv preprint. 2020. paper

    Abadal S, Jain A, Guirado R, et al. Computing Graph Neural Networks: A Survey from Algorithms to Accelerators. arXiv preprint. 2020. paper

    Lamb L, Garcez A, Gori M, et al. Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective. arXiv preprint. 2020. paper

    Journal papers:
    F. Scarselli, M. Gori, A.C. Tsoi, M. Hagenbuchner, G. Monfardini, The graph neural network model, IEEE Transactions on Neural Networks, 2009. paper.

    Scarselli F, Gori M, Tsoi A C, et al. Computational capabilities of graph neural networks, IEEE Transactions on Neural Networks, 2009. paper.

    Micheli A . Neural Network for Graphs: A Contextual Constructive Approach. IEEE Transactions on Neural Networks, 2009. paper.

    Goles, Eric, and Gonzalo A. Ruz. Dynamics of Neural Networks over Undirected Graphs. Neural Networks, 2015. paper.

    Z. Luo, L. Liu, J. Yin, Y. Li, Z. Wu, Deep Learning of Graphs with Ngram Convolutional Neural Networks, IEEE Transactions on Knowledge & Data Engineering, 2017. paper. code.

    Petroski Such F , Sah S , Dominguez M A , et al. Robust Spatial Filtering with Graph Convolutional Neural Networks. IEEE Journal of Selected Topics in Signal Processing, 2017. paper.

    Kawahara J, Brown C J, Miller S P, et al. BrainNetCNN: convolutional neural networks for brain networks; towards predicting neurodevelopment. NeuroImage, 2017. paper.

    Muscoloni A , Thomas J M , Ciucci S , et al. Machine learning meets complex networks via coalescent embedding in the hyperbolic space. Nature Communications, 2017. paper.

    D.M. Camacho, K.M. Collins, R.K. Powers, J.C. Costello, J.J. Collins, Next-Generation Machine Learning for Biological Networks, Cell, 2018. paper.

    Marinka Z , Monica A , Jure L . Modeling polypharmacy side effects with graph convolutional networks. Bioinformatics, 2018. paper.

    Sarah P , Ira K S , Enzo F , et al. Disease Prediction using Graph Convolutional Networks: Application to Autism Spectrum Disorder and Alzheimer’s Disease. Medical Image Analysis, 2018. paper.

    Sofia Ira Ktena, Sarah Parisot, Enzo Ferrante, Martin Rajchl, Matthew Lee, Ben Glocker, Daniel Rueckert, Metric learning with spectral graph convolutions on brain connectivity networks, NeuroImage, 2018. paper.

    Xie T , Grossman J C . Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties. Physical Review Letters, 2018. paper.

    Phan, Anh Viet, Minh Le Nguyen, Yen Lam Hoang Nguyen, and Lam Thu Bui. DGCNN: A Convolutional Neural Network over Large-Scale Labeled Graphs. Neural Networks, 2018. paper

    Song T, Zheng W, Song P, et al. Eeg emotion recognition using dynamical graph convolutional neural networks. IEEE Transactions on Affective Computing, 2018. paper

    Levie R, Monti F, Bresson X, et al. Cayleynets: Graph convolutional neural networks with complex rational spectral filters. IEEE Transactions on Signal Processing 2019. paper

    Zhang, Zhihong, Dongdong Chen, Jianjia Wang, Lu Bai, and Edwin R. Hancock. Quantum-Based Subgraph Convolutional Neural Networks. Pattern Recognition, 2019. paper

    Qin A, Shang Z, Tian J, et al. Spectral–Spatial Graph Convolutional Networks for Semisupervised Hyperspectral Image Classification. IEEE Geoscience and Remote Sensing Letters, 2019. paper

    Coley C W, Jin W, Rogers L, et al. A graph-convolutional neural network model for the prediction of chemical reactivity. Chemical Science, 2019. paper

    Zhang Z, Chen D, Wang Z, et al. Depth-based Subgraph Convolutional Auto-Encoder for Network Representation Learning. Pattern Recognition, 2019. paper

    Hong Y, Kim J, Chen G, et al. Longitudinal Prediction of Infant Diffusion MRI Data via Graph Convolutional Adversarial Networks. IEEE transactions on medical imaging, 2019. paper

    Khodayar M, Mohammadi S, Khodayar M E, et al. Convolutional Graph Autoencoder: A Generative Deep Neural Network for Probabilistic Spatio-temporal Solar Irradiance Forecasting. IEEE Transactions on Sustainable Energy, 2019. paper

    Zhang Q, Chang J, Meng G, et al. Learning graph structure via graph convolutional networks. Pattern Recognition, 2019. paper

    Xuan P, Pan S, Zhang T, et al. Graph Convolutional Network and Convolutional Neural Network Based Method for Predicting lncRNA-Disease Associations. Cells, 2019. paper

    Sun M, Zhao S, Gilvary C, et al. Graph convolutional networks for computational drug development and discovery. Briefings in bioinformatics, 2019. paper

    Spier N, Nekolla S, Rupprecht C, et al. Classification of Polar Maps from Cardiac Perfusion Imaging with Graph-Convolutional Neural Networks. Scientific reports, 2019. paper

    Heyuan Shi, et al. Hypergraph-Induced Convolutional Networks for Visual Classification. IEEE Transactions on Neural Networks and Learning Systems, 2019. paper

    S.Pan, et al. Learning Graph Embedding With Adversarial Training Methods. IEEE Transactions on Cybernetics, 2019. paper

    D. Grattarola, et al. Change Detection in Graph Streams by Learning Graph Embeddings on Constant-Curvature Manifolds. IEEE Transactions on Neural Networks and Learning Systems. 2019. paper

    Kan Guo, et al. Optimized Graph Convolution Recurrent Neural Network for Traffic Prediction. IEEE Transactions on Intelligent Transportation Systems. 2020. paper

    Ruiz L, et al. Invariance-preserving localized activation functions for graph neural networks. IEEE Transactions on Signal Processing, 2020. paper

    Li J, et al. Neural Inductive Matrix Completion with Graph Convolutional Networks for miRNA-disease Association Prediction. Bioinformatics, 2020. paper

    Bingzhi Chen, et al. Label Co-occurrence Learning with Graph Convolutional Networks for Multi-label Chest X-ray Image Classification. IEEE Journal of Biomedical and Health Informatics, 2020. paper

    Kunjin Chen, et al. Fault Location in Power Distribution Systems via Deep Graph Convolutional Networks. IEEE Journal on Selected Areas in Communications, 2020. paper

    Manessi, Franco, et al. Dynamic graph convolutional networks. Pattern Recognition, 2020. paper

    Jiang X, Zhu R, Li S, et al. Co-embedding of Nodes and Edges with Graph Neural Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence. paper

    Wang Z, Ji S. Second-order pooling for graph neural networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020. paper

    Indro Spinelli, et al. Adaptive Propagation Graph Convolutional Network. IEEE Transactions on Neural Networks and Learning Systems, 2020. paper

    Zhou Fan, et al. Reinforced Spatiotemporal Attentive Graph Neural Networks for Traffic Forecasting. IEEE Internet of Things Journal, 2020. paper

    Wang S H, et al. Covid-19 classification by FGCNet with deep feature fusion from graph convolutional network and convolutional neural network. Information Fusion, 2020. paper

    Ruiz, Luana et al. Gated Graph Recurrent Neural Networks, IEEE Transactions on Signal Processing. paper

    Gama, Fernando et al. Stability Properties of Graph Neural Networks, IEEE Transactions on Signal Processing. paper

    He, Xin et al. MV-GNN: Multi-View Graph Neural Network for Compression Artifacts Reduction, IEEE Transactions on Image Processing. paper

    Conference papers:
    Duvenaud D, Maclaurin D, Aguilera-Iparraguirre J, et al. Convolutional networks on graphs for learning molecular fingerprints, NeurIPS(NIPS) 2015. paper. code.

    M. Niepert, M. Ahmed, K. Kutzkov, Learning Convolutional Neural Networks for Graphs, ICML 2016. paper.

    S. Cao, W. Lu, Q. Xu, Deep neural networks for learning graph representations, AAAI 2016. paper.

    M. Defferrard, X. Bresson, P. Vandergheynst, Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, NeurIPS(NIPS) 2016. paper. code.

    T.N. Kipf, M. Welling, Semi-Supervised Classification with Graph Convolutional Networks, ICLR 2017. paper. code.

    A. Fout, B. Shariat, J. Byrd, A. Benhur, Protein Interface Prediction using Graph Convolutional Networks, NeurIPS(NIPS) 2017. paper.

    Monti F, Bronstein M, Bresson X. Geometric matrix completion with recurrent multi-graph neural networks, NeurIPS(NIPS) 2017. paper.

    Simonovsky M, Komodakis N. Dynamic edge-conditioned filters in convolutional neural networks on graphs, CVPR 2017. paper

    R. Li, S. Wang, F. Zhu, J. Huang, Adaptive Graph Convolutional Neural Networks, AAAI 2018. paper

    J. You, B. Liu, R. Ying, V. Pande, J. Leskovec, Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation, NeurIPS(NIPS) 2018. paper.

    C. Zhuang, Q. Ma, Dual Graph Convolutional Networks for Graph-Based Semi-Supervised Classification, WWW 2018. paper

    H. Gao, Z. Wang, S. Ji, Large-Scale Learnable Graph Convolutional Networks, KDD 2018. paper

    D. Zügner, A. Akbarnejad, S. Günnemann, Adversarial Attacks on Neural Networks for Graph Data, KDD 2018. paper

    Ying R , He R , Chen K , et al. Graph Convolutional Neural Networks for Web-Scale Recommender Systems. KDD 2018. paper

    P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, Y. Bengio, Graph Attention Networks, ICLR, 2018. paper

    Beck, Daniel Edward Robert, Gholamreza Haffari and Trevor Cohn. Graph-to-Sequence Learning using Gated Graph Neural Networks. ACL 2018. paper

    Yu B, Yin H, Zhu Z. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. IJCAI 2018. paper

    Chen J , Zhu J , Song L . Stochastic Training of Graph Convolutional Networks with Variance Reduction. ICML 2018. paper

    Gusi Te, Wei Hu, Amin Zheng, Zongming Guo, RGCNN: Regularized Graph CNN for Point Cloud Segmentation. ACM Multimedia 2018. paper, code

    Talukdar, Partha, Shikhar Vashishth, Shib Sankar Dasgupta and Swayambhu Nath Ray. Dating Documents using Graph Convolution Networks. ACL 2018. paper, code

    Sanchez-Gonzalez A , Heess N , Springenberg J T , et al. Graph networks as learnable physics engines for inference and control. ICML 2018. paper

    Muhan Zhang, Yixin Chen. Link Prediction Based on Graph Neural Networks. NeurIPS(NIPS) 2018. paper

    Chen, Jie, Tengfei Ma, and Cao Xiao. FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling. ICLR 2018. paper

    Zhang, Zhen, Hongxia Yang, Jiajun Bu, Sheng Zhou, Pinggang Yu, Jianwei Zhang, Martin Ester, and Can Wang. ANRL: Attributed Network Representation Learning via Deep Neural Networks. IJCAI 2018. paper

    Rahimi A , Cohn T , Baldwin T . Semi-supervised User Geolocation via Graph Convolutional Networks. ACL 2018. paper

    Morris C, Ritzert M, Fey M, et al. Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks. AAAI 2019. paper

    Xu K, Hu W, Leskovec J, et al. How Powerful are Graph Neural Networks?, ICLR 2019. paper

    Johannes Klicpera, Aleksandar Bojchevski, Stephan Günnemann. Combining Neural Networks with Personalized PageRank for Classification on Graphs, ICLR 2019. paper

    Daniel Zügner, Stephan Günnemann. Adversarial Attacks on Graph Neural Networks via Meta Learning, ICLR 2019. paper

    Zhang Xinyi, Lihui Chen. Capsule Graph Neural Network, ICLR 2019. paper

    Liao, R., Zhao, Z., Urtasun, R., and Zemel, R. LanczosNet: Multi-Scale Deep Graph Convolutional Networks, ICLR 2019, paper

    Bingbing Xu, Huawei Shen, Qi Cao, Yunqi Qiu, Xueqi Cheng. Graph Wavelet Neural Network, ICLR 2019, paper

    Hu J, Guo C, Yang B, et al. Stochastic Weight Completion for Road Networks using Graph Convolutional Networks. ICDE 2019. paper

    Yao L, Mao C, Luo Y . Graph Convolutional Networks for Text Classification. AAAI 2019. paper

    Landrieu L , Boussaha M . Point Cloud Oversegmentation with Graph-Structured Deep Metric Learning. CVPR 2019. paper

    Si C , Chen W , Wang W , et al. An Attention Enhanced Graph Convolutional LSTM Network for Skeleton-Based Action Recognition. CVPR 2019. paper

    Cucurull G , Taslakian P , Vazquez D . Context-Aware Visual Compatibility Prediction. CVPR 2019. paper

    Jia-Xing Zhong, Nannan Li, Weijie Kong, Shan Liu, Thomas H. Li, Ge Li. Graph Convolutional Label Noise Cleaner: Train a Plug-and-play Action Classifier for Anomaly Detection. CVPR 2019. paper

    Michael Kampffmeyer, Yinbo Chen, Xiaodan Liang, Hao Wang, Yujia Zhang, Eric P. Xing. Rethinking Knowledge Graph Propagation for Zero-Shot Learning. CVPR 2019. paper

    Arushi Goel, Keng Teck Ma, Cheston Tan. An End-to-End Network for Generating Social Relationship Graphs. CVPR 2019. paper

    Yichao Yan, Qiang Zhang, Bingbing Ni, Wendong Zhang, Minghao Xu, Xiaokang Yang. Learning Context Graph for Person Search. CVPR 2019 paper

    Zhongdao Wang, Liang Zheng, Yali Li, Shengjin Wang. Linkage Based Face Clustering via Graph Convolution Network. CVPR 2019 paper

    Lei Yang, Xiaohang Zhan, Dapeng Chen, Junjie Yan, Chen Change Loy, Dahua Lin. Learning to Cluster Faces on an Affinity Graph. CVPR 2019 paper

    Yao Ma, Suhang Wang, Charu C. Aggarwal, Jiliang Tang. Graph Convolutional Networks with EigenPooling. KDD2019, paper

    Wenqi Fan, Yao Ma, Qing Li, Yuan He, Eric Zhao, Jiliang Tang, Dawei Yin. Graph Neural Networks for Social Recommendation. WWW2019, paper

    Kim J, Kim T, Kim S, et al. Edge-labeling Graph Neural Network for Few-shot Learning. CVPR 2019. paper

    Jessica V. Schrouff, Kai Wohlfahrt, Bruno Marnette, Liam Atkinson. INFERRING JAVASCRIPT TYPES USING GRAPH NEURAL NETWORKS. ICLR 2019. paper

    Emanuele Rossi, Federico Monti, Michael Bronstein, Pietro liò. ncRNA Classification with Graph Convolutional Networks. SIGKDD 2019. paper

    Wu F, Zhang T, Souza Jr A H, et al. Simplifying Graph Convolutional Networks. ICML 2019. paper.

    Junhyun Lee, Inyeop Lee, Jaewoo Kang. Self-Attention Graph Pooling. ICML 2019. paper.

    Chiang W L, Liu X, Si S, et al. Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks. SIGKDD 2019. paper.

    Namyong Park, Andrey Kan, Xin Luna Dong, Tong Zhao, Christos Faloutsos, Estimating Node Importance in Knowledge Graphs Using Graph Neural Networks. SIGKDD 2019. paper.

    Wu S, Tang Y, Zhu Y, et al. Session-based Recommendation with Graph Neural Networks. AAAI 2019. paper.

    Qu M, Bengio Y, Tang J. GMNN: Graph Markov Neural Networks. ICML 2019. paper. code.

    Li Y, Gu C, Dullien T, et al. Graph Matching Networks for Learning the Similarity of Graph Structured Objects, ICML 2019. paper.

    Gao H, Ji S. Graph U-Nets, ICML 2019. paper.

    Bojchevski A, Günnemann S. Adversarial Attacks on Node Embeddings via Graph Poisoning, ICML 2019. paper.

    Jeong D, Kwon T, Kim Y, et al. Graph Neural Network for Music Score Data and Modeling Expressive Piano Performance. ICML 2019. paper.

    Zhang G, He H, Katabi D. Circuit-GNN: Graph Neural Networks for Distributed Circuit Design. ICML 2019. paper.

    Alet F, Jeewajee A K, Bauza M, et al. Graph Element Networks: adaptive, structured computation and memory, ICML 2019. paper.

    Rieck B, Bock C, Borgwardt K. A Persistent Weisfeiler-Lehman Procedure for Graph Classification, ICML 2019. paper.

    Walker I, Glocker B. Graph Convolutional Gaussian Processes. ICML 2019. paper.

    Yu Y, Chen J, Gao T, et al. DAG-GNN: DAG Structure Learning with Graph Neural Networks, ICML 2019. paper.

    Zhijiang Guo, Yan Zhang and Wei Lu. Attention Guided Graph Convolutional Networks for Relation Extraction. ACL 2019. paper. code.

    Chang Li, Dan Goldwasser. Encoding Social Information with Graph Convolutional Networks for Political Perspective Detection in News Media. ACL 2019. paper.

    Hao Zhu, Yankai Lin, Zhiyuan Liu, Jie Fu, Tat-seng Chua, Maosong Sun. Graph Neural Networks with Generated Parameters for Relation Extraction. ACL 2019. paper.

    Shikhar Vashishth, Manik Bhandari, Prateek Yadav, Piyush Rai, Chiranjib Bhattacharyya, Partha Talukdar. Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks. ACL 2019. paper.

    Cui Z, Li Z, Wu S, et al. Dressing as a Whole: Outfit Compatibility Learning Based on Node-wise Graph Neural Networks. WWW 2019. paper.

    Zhang, Chris, et al. Graph HyperNetworks for Neural Architecture Search. ICLR 2019. paper.

    Chen, Zhengdao, et al. Supervised Community Detection with Line Graph Neural Networks. ICLR 2019. paper.

    Maron, Haggai, et al. Invariant and Equivariant Graph Networks. ICLR 2019. paper.

    Gulcehre, Caglar, et al. Hyperbolic Attention Networks. ICLR, 2019. paper.

    Prates, Marcelo O. R., et al. Learning to Solve NP-Complete Problems -- A Graph Neural Network for the Decision TSP. AAAI, 2019. paper.

    Liu, Ziqi, et al. GeniePath: Graph Neural Networks with Adaptive Receptive Paths. AAAI, 2019. paper.

    Keriven N, Peyré G. Universal invariant and equivariant graph neural networks. NeurIPS, 2019. paper.

    Qi Liu, et al. Hyperbolic Graph Neural Networks. NeurIPS, 2019. paper.

    Zhitao Ying, et al. GNNExplainer: Generating Explanations for Graph Neural Networks. NeurIPS, 2019. paper.

    Yaqin Zhou, et al. Devign: Effective Vulnerability Identification by Learning Comprehensive Program Semantics via Graph Neural Networks. NeurIPS, 2019. paper.

    Ehsan Hajiramezanali, et al. Variational Graph Recurrent Neural Networks. NeurIPS, 2019. paper.

    Sitao Luan, et al. Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks. NeurIPS, 2019. paper.

    Difan Zou, et al. Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks. NeurIPS, 2019. paper.

    Seongjun Yun, et al. Graph Transformer Networks. NeurIPS, 2019. paper.

    Andrei Nicolicioiu, et al. Recurrent Space-time Graph Neural Networks. NeurIPS, 2019. paper.

    Nima Dehmamy, et al. Understanding the Representation Power of Graph Neural Networks in Learning Graph Topology. NeurIPS, 2019. paper.

    Maxime Gasse, et al. Exact Combinatorial Optimization with Graph Convolutional Neural Networks. NeurIPS, 2019. paper.

    Zhengdao Chen, et al. On the equivalence between graph isomorphism testing and function approximation with GNNs. NeurIPS, 2019. paper.

    Vineet Kosaraju, et al. Social-BiGAT: Multimodal Trajectory Forecasting using Bicycle-GAN and Graph Attention Networks. NeurIPS, 2019. paper.

    Carl Yang, et al. Conditional Structure Generation through Graph Variational Generative Adversarial Nets. NeurIPS, 2019. paper.

    Naganand Yadati, et al. HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs. NeurIPS, 2019. paper.

    Haggai Maron, et al. Provably Powerful Graph Networks. NeurIPS, 2019. paper.

    Eliya Nachmani, et al. Hyper-Graph-Network Decoders for Block Codes. NeurIPS, 2019. paper.

    Hanjun Dai, et al. Learning Transferable Graph Exploration. NeurIPS, 2019. paper.

    Ryoma Sato, et al. Approximation Ratios of Graph Neural Networks for Combinatorial Problems. NeurIPS, 2019. paper.

    Boris Knyazev, et al. Understanding Attention and Generalization in Graph Neural Networks. NeurIPS, 2019. paper.

    Renjie Liao, et al. Efficient Graph Generation with Graph Recurrent Attention Networks. NeurIPS, 2019. paper.

    Bryan Wilder, et al. End to end learning and optimization on graphs. NeurIPS, 2019. paper.

    Simon Du, et al. Graph Neural Tangent Kernel: Fusing Graph Neural Networks with Graph Kernels. NeurIPS, 2019. paper.

    W. O. K. Asiri Suranga Wijesinghe, et al. DFNets: Spectral CNNs for Graphs with Feedback-looped Filters. NeurIPS, 2019. paper.

    Dong Wook Shu, et al.3D Point Cloud Generative Adversarial Network Based on Tree Structured Graph Convolutions. ICCV 2019. paper

    Yujun Cai, et al. Exploiting Spatial-temporal Relationships for 3D Pose Estimation via Graph Convolutional Networks. ICCV 2019. paper

    Runhao Zeng, et al. Graph Convolutional Networks for Temporal Action Localization. ICCV 2019. paper

    Yin Bi, et al. Graph-Based Object Classification for Neuromorphic Vision Sensing. ICCV 2019. paper

    Tianshui Chen, et al. Learning Semantic-Specific Graph Representation for Multi-Label Image Recognition. ICCV 2019. paper

    Linjie Li, et al. Relation-Aware Graph Attention Network for Visual Question Answering. ICCV 2019. paper

    Jiwoong Park, et al. Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning. ICCV 2019. paper

    Runzhong Wang, et al. Learning Combinatorial Embedding Networks for Deep Graph Matching. ICCV 2019. paper

    Zhiqiang Tao, et al. Adversarial Graph Embedding for Ensemble Clustering. IJCAI 2019. paper

    Xiaotong Zhang, et al. Attributed Graph Clustering via Adaptive Graph Convolution. IJCAI 2019. paper

    Jianwen Jiang, et al. Dynamic Hypergraph Neural Networks. IJCAI 2019. paper

    Hogun Park, et al. Exploiting Interaction Links for Node Classification with Deep Graph Neural Networks. IJCAI 2019. paper

    Hao Peng, et al. Fine-grained Event Categorization with Heterogeneous Graph Convolutional Networks. IJCAI 2019. paper

    Chengfeng Xu, et al. Graph Contextualized Self-Attention Network for Session-based Recommendation. IJCAI 2019. paper

    Ruiqing Xu, et al. Graph Convolutional Network Hashing for Cross-Modal Retrieval. IJCAI 2019. paper

    Bingbing Xu, et al. Graph Convolutional Networks using Heat Kernel for Semi-supervised Learning. IJCAI 2019. paper

    Zonghan Wu, et al. Graph WaveNet for Deep Spatial-Temporal Graph Modeling. IJCAI 2019. paper

    Fenyu Hu, et al. Hierarchical Graph Convolutional Networks for Semi-supervised Node Classification. IJCAI 2019. paper

    Li Zheng, et al. AddGraph: Anomaly Detection in Dynamic Graph Using Attention-based Temporal GCN. IJCAI 2019. paper

    Liang Yang, et al. Dual Self-Paced Graph Convolutional Network: Towards Reducing Attribute Distortions Induced by Topology. IJCAI 2019. paper

    Liang Yang, et al. Masked Graph Convolutional Network. IJCAI 2019. paper

    Xiaofeng Xu, et al. Learning Image-Specific Attributes by Hyperbolic Neighborhood Graph Propagation. IJCAI 2019. paper

    Li G, Müller M, Thabet A, et al. Can GCNs Go as Deep as CNNs? ICCV 2019. paper.

    Park C, Lee C, Bahng H, et al. STGRAT: A Spatio-Temporal Graph Attention Network for Traffic Forecasting. AAAI 2020. paper.

    Liu Y, Wang X, Wu S, et al. Independence Promoted Graph Disentangled Networks. AAAI 2020. paper.

    Shi H, Fan H, Kwok J T. Effective Decoding in Graph Auto-Encoder using Triadic Closure. AAAI 2020. paper.

    Wang X, Wang R, Shi C, et al. Multi-Component Graph Convolutional Collaborative Filtering. AAAI 2020. paper.

    Su J, Beling P A, Guo R, et al. Graph Convolution Networks for Probabilistic Modeling of Driving Acceleration. AAAI 2020. paper.

    Claudio Gallicchio and Alessio Micheli. Fast and Deep Graph Neural Networks. AAAI 2020. paper.

    Peng W, Hong X, Chen H, et al. Learning Graph Convolutional Network for Skeleton-based Human Action Recognition by Neural Searching. AAAI 2020. paper.

    Paliwal A, Loos S, Rabe M, et al. Graph Representations for Higher-Order Logic and Theorem Proving. AAAI 2020. paper.

    Kenta Oono, et al. Graph Neural Networks Exponentially Lose Expressive Power for Node Classification. ICLR 2020. paper.

    Muhan Zhang, et al. Inductive Matrix Completion Based on Graph Neural Networks. ICLR 2020. paper.

    Pablo Barceló, et al. The Logical Expressiveness of Graph Neural Networks. ICLR 2020. paper

    Weihua Hu, et al. Strategies for Pre-training Graph Neural Networks. ICLR 2020. paper

    Hongbin Pei, et al. Geom-GCN: Geometric Graph Convolutional Networks. ICLR 2020. paper

    Ze Ye, et al. Curvature Graph Network. ICLR 2020. paper

    Andreas Loukas, et al. What graph neural networks cannot learn: depth vs width. ICLR 2020. paper

    Federico Errica, et al. A Fair Comparison of Graph Neural Networks for Graph Classification. ICLR 2020. paper

    Kai Zhang, et al. Adaptive Structural Fingerprints for Graph Attention Networks. ICLR 2020. paper

    Shikhar Vashishth, et al. Composition-based Multi-Relational Graph Convolutional Networks. ICLR 2020. paper

    Jiayi Wei, et al. LambdaNet: Probabilistic Type Inference using Graph Neural Networks. ICLR 2020. paper

    Jiechuan Jiang, et al. Graph Convolutional Reinforcement Learning. ICLR 2020. paper

    Yifan Hou, et al. Measuring and Improving the Use of Graph Information in Graph Neural Networks. ICLR 2020. paper

    Ruochi Zhang, et al. Hyper-SAGNN: a self-attention based graph neural network for hypergraphs. ICLR 2020. paper

    Yu Rong, et al. DropEdge: Towards Deep Graph Convolutional Networks on Node Classification. ICLR 2020. paper

    Yuyu Zhang, et al. Efficient Probabilistic Logic Reasoning with Graph Neural Networks. ICLR 2020. paper

    Amir hosein Khasahmadi, et al. Memory-based graph networks. ICLR 2020. paper

    Zeng, Hanqing, et al. GraphSAINT: Graph Sampling Based Inductive Learning Method. ICLR 2020. paper

    Jiangke Lin, et al. Towards High-Fidelity 3D Face Reconstruction from In-the-Wild Images Using Graph Convolutional Networks. CVPR 2020. paper

    Oytun Ulutan, et al. VSGNet: Spatial Attention Network for Detecting Human Object Interactions Using Graph Convolutions. CVPR 2020. paper

    Qiangeng Xu, et al. Grid-GCN for Fast and Scalable Point Cloud Learning. CVPR 2020. paper

    Abduallah Mohamed and Kun Qian, Social-STGCNN: A Social Spatio-Temporal Graph Convolutional Neural Network for Human Trajectory Prediction. CVPR 2020. paper

    Kaihua Zhang, et al. Adaptive Graph Convolutional Network with Attention Graph Clustering for Co-saliency Detection. CVPR 2020. paper

    Jiaming Shen, et al. TaxoExpan: Self-supervised Taxonomy Expansion with Position-Enhanced Graph Neural Network. WWW 2020. paper

    Deyu Bo, et al. Structural Deep Clustering Network. WWW 2020. paper

    Xinyu Fu, et al. MAGNN: Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding. WWW 2020. paper

    Man Wu, et al. Unsupervised Domain Adaptive Graph Convolutional Networks. WWW 2020. paper

    Yiwei Sun, et al. Adversarial Attacks on Graph Neural Networks via Node Injections: A Hierarchical Reinforcement Learning Approach. WWW 2020. paper

    Xiaoyang Wang, et al. Traffic Flow Prediction via Spatial Temporal Graph Neural Network. WWW 2020. paper

    Qiaoyu Tan, et al. Learning to Hash with Graph Neural Networks for Recommender Systems. WWW 2020. paper

    Liang Qu, et al. Continuous-Time Link Prediction via Temporal Dependent Graph Neural Network. WWW 2020. paper

    Wei Jin, et al. Graph Structure Learning for Robust Graph Neural Networks. KDD 2020. paper, code.

    Zonghan Wu, et al. Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks. KDD 2020. paper.

    Zhen Yang, et al. Understanding Negative Sampling in Graph Representation Learning. KDD 2020. paper.

    Menghan Wang, et al. M2GRL: A Multi-task Multi-view Graph Representation Learning Framework for Web-scale Recommender Systems. KDD 2020. paper.

    Louis-Pascal A. C. Xhonneux, et al. Continuous Graph Neural Networks. ICML 2020. paper.

    Marc Brockschmidt, et al. GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation. ICML 2020. paper to appear.

    Arman Hasanzadeh, et al. Bayesian Graph Neural Networks with Adaptive Connection Sampling. ICML 2020. paper to appear.

    Filipe de Avila Belbute-Peres, et al. Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction. ICML 2020. paper to appear.

    Ilay Luz, et al. Learning Algebraic Multigrid Using Graph Neural Networks. ICML 2020. paper to appear.

    Vikas K Garg, et al. Generalization and Representational Limits of Graph Neural Networks. ICML 2020. paper to appear.

    Shuai Zhang, et al. Fast Learning of Graph Neural Networks with Guaranteed Generalizability: One-hidden-layer Case. ICML 2020. paper to appear.

    Filippo Maria Bianchi, et al. Spectral Clustering with Graph Neural Networks for Graph Pooling. ICML 2020. paper to appear.

    Ming Chen, et al. Simple and Deep Graph Convolutional Networks. ICML 2020. paper to appear.

    Yuning You, et al. When Does Self-Supervision Help Graph Convolutional Networks?. ICML 2020. paper to appear.

    Gregor Bachmann, et al. Constant Curvature Graph Convolutional Networks. ICML 2020. paper to appear.

    Wenhui Yu, et al. Graph Convolutional Network for Recommendation with Low-pass Collaborative Filters. ICML 2020. paper to appear.

    Hongmin Zhu, et al. Bilinear Graph Neural Network with Neighbor Interactions. IJCAI 2020. paper.

    Shuo Zhang, et al. Improving Attention Mechanism in Graph Neural Networks via Cardinality Preservation. IJCAI 2020. paper.

    Kaixiong Zhou, et al. Multi-Channel Graph Neural Networks. IJCAI 2020. paper.

    George Dasoulas, et al. Coloring Graph Neural Networks for Node Disambiguation. IJCAI 2020. paper.

    Xuan Lin, et al. KGNN: Knowledge Graph Neural Network for Drug-Drug Interaction Prediction. IJCAI 2020. paper.

    Yuan Zhuang, et al. Smart Contract Vulnerability Detection using Graph Neural Network. IJCAI 2020. paper.

    Ziyu Jia, et al. GraphSleepNet: Adaptive Spatial-Temporal Graph Convolutional Networks for Sleep Stage Classification. IJCAI 2020. paper.

    Zhichao Huang, et al. MR-GCN: Multi-Relational Graph Convolutional Networks based on Generalized Tensor Product. IJCAI 2020. paper.

    Rongzhou Huang, et al. LSGCN: Long Short-Term Traffic Prediction with Graph Convolutional Networks. IJCAI 2020. paper.

    Min Shi, et al. Multi-Class Imbalanced Graph Convolutional Network Learning. IJCAI 2020. paper.

    Dongxiao He, et al. Community-Centric Graph Convolutional Network for Unsupervised Community Detection. IJCAI 2020. paper.

    Luana Ruiz et al. Graphon Neural Networks and the Transferability of Graph Neural Networks. NeurIPS 2020. paper

    Diego Mesquita et al. Rethinking pooling in graph neural networks. NeurIPS 2020. paper

    Petar Veličković et al. Pointer Graph Networks. NeurIPS 2020. paper

    Andreas Loukas. How hard is to distinguish graphs with graph neural networks?. NeurIPS 2020. paper

    Shangchen Zhou et al. Cross-Scale Internal Graph Neural Network for Image Super-Resolution. NeurIPS 2020. paper

    Jiaqi Ma et al. Towards More Practical Adversarial Attacks on Graph Neural Networks. NeurIPS 2020. paper

    Kaixiong Zhou et al. Towards Deeper Graph Neural Networks with Differentiable Group Normalization. NeurIPS 2020. paper

    Benjamin Sanchez-Lengeling et al. Evaluating Attribution for Graph Neural Networks. NeurIPS 2020. paper

    Ziqi Liu et al. Bandit Samplers for Training Graph Neural Networks. NeurIPS 2020. paper

    Jiong Zhu et al. Beyond Homophily in Graph Neural Networks: Current Limitations and Effective Designs. NeurIPS 2020. paper

    Emily Alsentzer et al. Subgraph Neural Networks. NeurIPS 2020. paper

    Zhen Zhang et al. Factor Graph Neural Networks. NeurIPS 2020. paper

    Xiang Zhang et al. GNNGuard: Defending Graph Neural Networks against Adversarial Attacks. NeurIPS 2020. paper

    Zhengdao Chen et al. Can Graph Neural Networks Count Substructures?. NeurIPS 2020. paper

    Fangda Gu et al. Implicit Graph Neural Networks. NeurIPS 2020. paper

    Minh Vu et al. PGM-Explainer: Probabilistic Graphical Model Explanations for Graph Neural Networks. NeurIPS 2020. paper

    Simon Geisler et al. Reliable Graph Neural Networks via Robust Aggregation. NeurIPS 2020. paper

    Clément Vignac et al. Building powerful and equivariant graph neural networks with structural message-passing. NeurIPS 2020. paper

    Ming Chen et al. Scalable Graph Neural Networks via Bidirectional Propagation. NeurIPS 2020. paper

    Giannis Nikolentzos et al. Random Walk Graph Neural Networks. NeurIPS 2020. paper

    Zheng Ma et al. Path Integral Based Convolution and Pooling for Graph Neural Networks. NeurIPS 2020. paper

    Jiaxuan You et al. Design Space for Graph Neural Networks. NeurIPS 2020. paper

    Defu Cao et al. Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting. NeurIPS 2020. paper

    Kenta Oono et al. Optimization and Generalization Analysis of Transduction through Gradient Boosting and Application to Multi-scale Graph Neural Networks. NeurIPS 2020. paper

    Yu Chen et al. Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings. NeurIPS 2020. paper

    Dongsheng Luo et al. Parameterized Explainer for Graph Neural Network. NeurIPS 2020. paper

    Martin Klissarov et al. Reward Propagation Using Graph Convolutional Networks. NeurIPS 2020. paper

    Yimeng Min et al. Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks. NeurIPS 2020. paper

    Lei Bai et al. Adaptive Graph Convolutional Recurrent Network for Traffic Forecasting. NeurIPS 2020. paper

    Moshe Eliasof et al. DiffGCN: Graph Convolutional Networks via Differential Operators and Algebraic Multigrid Pooling. NeurIPS 2020. paper

    Pantelis Elinas et al. Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings. NeurIPS 2020. paper

    Yiding Yang et al. Factorizable Graph Convolutional Networks. NeurIPS 2020. paper

    Nicolas Keriven et al. Convergence and Stability of Graph Convolutional Networks on Large Random Graphs. NeurIPS 2020. paper

    Chen K, Niu M, Chen Q. A Hierarchical Reasoning Graph Neural Network for The Automatic Scoring of Answer Transcriptions in Video Job Interviews. AAAI 2021. paper

    arXiv papers:
    Li Y, Tarlow D, Brockschmidt M, et al. Gated graph sequence neural networks. arXiv 2015. paper

    Henaff M, Bruna J, LeCun Y. Deep convolutional networks on graph-structured data, arXiv 2015. paper

    Hechtlinger Y, Chakravarti P, Qin J. A generalization of convolutional neural networks to graph-structured data. arXiv 2017. paper

    Marcheggiani D, Titov I. Encoding sentences with graph convolutional networks for semantic role labeling. arXiv 2017. paper

    Battaglia P W, Hamrick J B, Bapst V, et al. Relational inductive biases, deep learning, and graph networks, arXiv 2018. paper

    Verma S, Zhang Z L. Graph Capsule Convolutional Neural Networks. arXiv 2018. paper

    Zhang T, Zheng W, Cui Z, et al. Tensor graph convolutional neural network. arXiv 2018. paper

    Zou D, Lerman G. Graph Convolutional Neural Networks via Scattering. arXiv 2018. paper

    Du J, Zhang S, Wu G, et al. Topology Adaptive Graph Convolutional Networks. arXiv 2018. paper.

    Shang C, Liu Q, Chen K S, et al. Edge Attention-based Multi-Relational Graph Convolutional Networks. arXiv 2018. paper.

    Scardapane S, Vaerenbergh S V, Comminiello D, et al. Improving Graph Convolutional Networks with Non-Parametric Activation Functions. arXiv 2018. paper.

    Wang Y, Sun Y, Liu Z, et al. Dynamic Graph CNN for Learning on Point Clouds. arXiv 2018. paper.

    Ryu S, Lim J, Hong S H, et al. Deeply learning molecular structure-property relationships using attention- and gate-augmented graph convolutional network. arXiv 2018. paper.

    Cui Z, Henrickson K, Ke R, et al. High-Order Graph Convolutional Recurrent Neural Network: A Deep Learning Framework for Network-Scale Traffic Learning and Forecasting. arXiv 2018. paper.

    Shchur O, Mumme M, Bojchevski A, et al. Pitfalls of Graph Neural Network Evaluation. arXiv 2018. paper.

    Bai Y, Ding H, Bian S, et al. Graph Edit Distance Computation via Graph Neural Networks. arXiv 2018. paper.

    Pedro H. C. Avelar, Henrique Lemos, Marcelo O. R. Prates, Luis Lamb, Multitask Learning on Graph Neural Networks - Learning Multiple Graph Centrality Measures with a Unified Network. arXiv 2018. paper.

    Matthew Baron, Topology and Prediction Focused Research on Graph Convolutional Neural Networks. arXiv 2018. paper.

    Wenting Zhao, Chunyan Xu, Zhen Cui, Tong Zhang, Jiatao Jiang, Zhenyu Zhang, Jian Yang, When Work Matters: Transforming Classical Network Structures to Graph CNN. arXiv 2018. paper.

    Xavier Bresson, Thomas Laurent, Residual Gated Graph ConvNets. arXiv 2018. paper.

    Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Vadim Sheinin. Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks. arXiv 2018. paper.

    Xiaojie Guo, Lingfei Wu, Liang Zhao. Deep Graph Translation. arXiv 2018. paper.

    Choma, Nicholas, et al. Graph Neural Networks for IceCube Signal Classification. arXiv 2018. paper.

    Tyler Derr, Yao Ma, Jiliang Tang. Signed Graph Convolutional Network. arXiv 2018. paper.

    Yawei Luo, Tao Guan, Junqing Yu, Ping Liu, Yi Yang. Every Node Counts: Self-Ensembling Graph Convolutional Networks for Semi-Supervised Learning. arXiv 2018. paper.

    Sun K, Koniusz P, Wang J. Fisher-Bures Adversary Graph Convolutional Networks. arXiv 2019. paper.

    Kazi A, Burwinkel H, Vivar G, et al. InceptionGCN: Receptive Field Aware Graph Convolutional Network for Disease Prediction. arXiv 2019. paper.

    Lemos H, Prates M, Avelar P, et al. Graph Colouring Meets Deep Learning: Effective Graph Neural Network Models for Combinatorial Problems. arXiv 2019. paper.

    Diehl F, Brunner T, Le M T, et al. Graph Neural Networks for Modelling Traffic Participant Interaction. arXiv 2019. paper.

    Murphy R L, Srinivasan B, Rao V, et al. Relational Pooling for Graph Representations. arXiv 2019. paper.

    Zhang W, Shu K, Liu H, et al. Graph Neural Networks for User Identity Linkage. arXiv 2019. paper.

    Ruiz L, Gama F, Ribeiro A. Gated Graph Convolutional Recurrent Neural Networks. arXiv 2019. paper.

    Phillips S, Daniilidis K. All Graphs Lead to Rome: Learning Geometric and Cycle-Consistent Representations with Graph Convolutional Networks. arXiv 2019. paper.

    Hu F, Zhu Y, Wu S, et al. Semi-supervised Node Classification via Hierarchical Graph Convolutional Networks. arXiv 2019. paper.

    Deng Z, Dong Y, Zhu J. Batch Virtual Adversarial Training for Graph Convolutional Networks. arXiv 2019. paper.

    Chen Z M, Wei X S, Wang P, et al. Multi-Label Image Recognition with Graph Convolutional Networks. arXiv 2019. paper.

    Mallea M D G, Meltzer P, Bentley P J. Capsule Neural Networks for Graph Classification using Explicit Tensorial Graph Representations. arXiv 2019. paper.

    Peter Meltzer, Marcelo Daniel Gutierrez Mallea and Peter J. Bentley. PiNet: A Permutation Invariant Graph Neural Network for Graph Classification. arXiv 2019. paper.

    Padraig Corcoran. Function Space Pooling For Graph Convolutional Networks. arXiv 2019. paper.

    Sébastien Lerique, Jacob Levy Abitbol, and Márton Karsai. Joint embedding of structure and features via graph convolutional networks. arXiv 2019. paper.

    Chen D, Lin Y, Li W, et al. Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View. arXiv 2019. paper

    Ohue M, Ii R, Yanagisawa K, et al. Molecular activity prediction using graph convolutional deep neural network considering distance on a molecular graph. arXiv 2019. paper.

    Gao X, Xiong H, Frossard P. iPool--Information-based Pooling in Hierarchical Graph Neural Networks. arXiv 2019. paper.

    Zhou K, Song Q, Huang X, et al. Auto-GNN: Neural Architecture Search of Graph Neural Networks. arXiv 2019. paper.

    Vijay Prakash Dwivedi, et al. Benchmarking Graph Neural Networks. arXiv 2020. paper.

    Dai Quoc Nguyen, Tu Dinh Nguyen, Dinh Phung. Universal Self-Attention Network for Graph Classification. arXiv 2020. paper

    Open-source platforms for GNNs
    Deep Graph Library (DGL)
    DGL is developed and maintained by New York University, New York University Shanghai, the AWS Shanghai Research Institute, and the AWS MXNet Science Team.

    Initiation time: 2018.

    Source: URL, github

    NGra
    NGra is developed and maintained by Peking University and Microsoft Research Asia.

    Initiation time: 2018

    Source: pdf

    Graph_nets
    Graph_nets is developed and maintained by DeepMind, Google Corp.

    Initiation time: 2018

    Source: github

    Euler
    Euler is developed and maintained by Alimama, which belongs to Alibaba Group.

    Initiation time: 2019

    Source: github

    PyTorch Geometric
    PyTorch Geometric is developed and maintained by TU Dortmund University, Germany.

    Initiation time: 2019

    Source: github paper

    PyTorch-BigGraph (PBG)
    PBG is developed and maintained by Facebook AI Research.

    Initiation time: 2019

    Source: github paper

    Angel
    Angel is developed and maintained by Tencent Inc.

    Initiation time: 2019

    Source: github

    Plato
    Plato is developed and maintained by Tencent Inc.

    Initiation time: 2019

    Source: github

    PGL
    PGL is developed and maintained by Baidu Inc.

    Initiation time: 2019

    Source: github

    OGB
    Open Graph Benchmark (OGB) is developed and maintained by Stanford University.

    Initiation time: 2019

    Source: github

    Benchmarking GNNs
    Benchmarking GNNs is developed and maintained by Nanyang Technological University.

    Initiation time: 2020

    Source: github

    Graph-Learn
    Graph-Learn is developed and maintained by Alibaba Group.

    Initiation time: 2020

    Source: github

    AutoGL (Auto Graph Learning)
    AutoGL is developed and maintained by Tsinghua University.

    Initiation time: 2020

    Source: github

    Appetizer for you: Art Exhibition in the Ultra-High-Dimensional Network/Graph-Structured Space
  • This is the best professional technical article I have seen in my whole career. Thank you, Prof. Zou!
    Time for some banter...

    Chasing a tech star this late at night, you can tell I am a genuine tech fan, right?
    Prof. Zou's technical skills are rock solid, and he is also very nice; in China's restless tech scene, he is a rare gem.
    What I admire is not just how strong his technical skills are (truly strong, strong beyond what you would expect: I used to think blockchain was born together with Bitcoin, but it turns out Prof. Zou has been researching blockchain for more than 20 years!)

    Actually, Prof. Zou has one more remarkable side: he is also a great educator. But I will stop there... me and my big mouth...

    Source: Mars Finance (火星财经)


    Zou Jie: Hello everyone! The topic of my talk today is "A Technical Deep Dive into Libra".
    First, the background: why study Libra at all?
    It starts with the predicament blockchain is in. To date, blockchain has had no successful business model other than issuing coins and speculating on them, and in most cases those two models are neither compliant nor legal. The other business models, such as evidence storage, provenance tracing, loyalty points, and invoicing, do not make money and are not successful business models; they can be built, but they are mostly pseudo-demands.
    Libra is the first project to couple blockchain with digital currency in a compliant, legal way: it issues no "air coins", only a digital currency pegged to fiat value and backed by an equivalent fiat reserve for price support. To be compliant it cannot be a public chain, because every node must itself be compliant, which is why a consortium chain was chosen. That choice was also made possible by breakthrough progress in Byzantine fault tolerant (BFT) consensus, which can now handle a hundred or more nodes efficiently. This differs from earlier consortium chains such as Hyperledger Fabric: those chains issue no coin and serve mainly for exchanging information rather than real value, and without a coin no value exchange is possible. Compliance is an inherent trait of consortium chains, which were previously used mostly in industry settings such as supply-chain finance, but those deployments never went through a full compliance process. Libra is the first consortium chain to take a digital currency through the full legal and regulatory process.
    Although Libra is the biggest breakthrough since Bitcoin, the conflict between its vision and reality is still enormous. Libra's vision is to provide banking services to people without bank accounts, but most of those people live where the local fiat currency is unstable or not freely convertible, while Libra can only support the stable, freely convertible currencies of developed countries. So Libra's vision and reality are in fundamental conflict, and that conflict cannot be resolved by technology: Libra is a global currency, and the fiat currencies of those developing countries cannot become part of one. Libra also has its limitations, at least for now: it only supports payments, only a single global digital currency, and it uses a new language, Move. Half a year on, Move has still not been formally released; to this day it runs on an intermediate, temporary IR.
    Libra's goals and its relation to earlier blockchains. First, it aims to break out of the limits of the coin world, where issuing and trading coins are the only two applications with real commercial value. Of all the coins ever issued, essentially none except Bitcoin (including Bitcoin Cash) and Ethereum can be used as a means of payment for real goods. The first application beyond the coin world is therefore payment, which is exactly what Libra focuses on; it was also Satoshi Nakamoto's main motivation when inventing blockchain and Bitcoin. Since payment involves real money, security is a necessity. To be used in the real world, payment also needs efficiency, i.e. a sufficiently high TPS; that is the limitation of Ethereum and Bitcoin and the lesson of EOS, which showed that a blockchain can reach TPS in the thousands. Smart contracts are also necessary, and their security is absolutely critical; that is the lesson of Ethereum, where many contract vulnerabilities were found and exploited, making it very hard to develop a completely bug-free smart contract there, which is fatal for financial applications. The core of Libra is the Move language with its verifier and virtual machine, which guarantee safety and can progress step by step toward complete formal verification, essentially ruling out vulnerabilities at the code level.
    Now to the core technology, since this lecture is mainly about Libra's technology.
    First of all, Libra is not a blockchain; it is a database, a self-authenticating, append-only database. Self-authenticating means that the data already stored can be used to verify whether the database has been tampered with; append-only means it grows monotonically with the number of transactions, irreversibly, with no possibility of rollback. The data structure used for this is the Sparse Merkle Tree, which Ethereum also uses.

    When some nodes hold no data, the tree can prove that data is absent at those nodes, and it can verify the correctness of the data at the nodes that do hold it. It has further properties: it supports sharding, which is very important for scaling the database, and it supports parallel updates, which is very important for raising TPS. It also allows the initial transaction validation and the signature verification to run in parallel, which matters as well. Proof: the VRF verification function Verify(alpha, f, r, pi), for example alpha = H(h4 || H(H(2 || r) || h3)).
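The proof expression above is a Merkle path: hash the leaf, then fold in each sibling hash up to the root. A minimal, illustrative sketch (SHA3-256 and the byte encoding are assumptions for the example, not Libra's actual wire format):

```python
import hashlib

def H(*parts):
    """Concatenate byte-string parts and hash them with SHA3-256."""
    return hashlib.sha3_256(b"".join(parts)).digest()

def verify_path(leaf_hash, siblings, root):
    """Recompute the root from a leaf hash and (sibling, sibling_on_right) pairs."""
    acc = leaf_hash
    for sib, sib_on_right in siblings:
        acc = H(acc, sib) if sib_on_right else H(sib, acc)
    return acc == root

# Mirror the talk's example alpha = H(h4 || H(H(2 || r) || h3)):
r = b"r"                        # illustrative leaf randomness
h3, h4 = H(b"3"), H(b"4")       # sibling hashes along the path
alpha = H(h4, H(H(b"2", r), h3))
assert verify_path(H(b"2", r), [(h3, True), (h4, False)], alpha)
assert not verify_path(H(b"9", r), [(h3, True), (h4, False)], alpha)
```

Any change to the leaf or to a sibling changes the recomputed root, which is what makes the database self-authenticating.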
    Libra also considered using an AVL tree, whose core idea is a balanced tree.

    It tries to keep the branches of the tree balanced, so that storage and retrieval are as fast as possible. The figure below shows an AVL tree: it is very well balanced, with no branch growing much longer than the rest. The AVL tree has also been developed further into the AVL+ tree.

    On the left, the AVL+ tree stores each node's data; on the right is the proof tree, holding the hash corresponding to each node. When a node's data changes, the proof tree on the right changes as well, which is how tampering with the data is detected. The proof function used in Libra is in fact very similar to the VRF verification function: it involves an initial value, the function itself, the computed result, and a proof of that result.
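The data-tree/proof-tree pairing can be sketched as follows; this is a toy illustration of the idea (a parallel tree of hashes that shifts whenever any node's data changes), not Libra's actual structure:

```python
import hashlib

def node_hash(value, left=None, right=None):
    """Hash a node's data together with its children's hashes."""
    h = hashlib.sha3_256()
    h.update(value.encode())
    for child in (left, right):
        if child is not None:
            h.update(child)
    return h.digest()

# data tree:     "b"
#              /     \
#            "a"     "c"
leaf_a, leaf_c = node_hash("a"), node_hash("c")
root = node_hash("b", leaf_a, leaf_c)

# Tamper with one leaf: the proof tree's root no longer matches.
tampered_root = node_hash("b", node_hash("x"), leaf_c)
assert tampered_root != root
```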
    Another point: transactions finalized by Libra's BFT consensus are deterministic. Bitcoin's consensus mechanism is different: no transaction under PoW consensus is ever 100% final. This is an important distinction. LibraBFT has a recent update: the chain may contain empty blocks and temporary side branches, and whatever survives the final round of LibraBFT consensus becomes the main chain.
    Hashing and signatures also use fairly new technology. Signatures use Ed25519, the Edwards-curve Digital Signature Algorithm (EdDSA, 2017), which can be considered the most recent algorithm on the Ed25519 curve. Its distinguishing feature is the absence of the suspected NSA backdoor: Ethereum and Bitcoin use the P-256 curve, which is suspected of containing one, and avoiding such backdoors is also why this algorithm was adopted for TLS 1.3, finalized in 2018. EdDSA is likewise the algorithm NIST recommended in 2017 for all US federal agencies. Furthermore, every transaction carries signatures from more than 2/3 of the (3f+1) nodes, aggregated into a single Schnorr signature. A blockchain links its blocks together by each block's hash; the Libra database instead has every transaction signed by more than 2/3 of the (3f+1) nodes, gives every transaction a sequence number, and gives the whole database a version number counting all transactions since genesis, to prevent replay and rollback.
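The quorum arithmetic and the sequence-number rule above can be sketched directly; function names here are illustrative, not Libra's API:

```python
def quorum_size(n):
    """With n = 3f + 1 validators, a quorum is 2f + 1 signatures."""
    f = (n - 1) // 3          # largest number of tolerated faulty nodes
    return 2 * f + 1

assert quorum_size(4) == 3     # f = 1
assert quorum_size(100) == 67  # f = 33

def accept(tx_seq, account_seq):
    """A transaction is valid only at the account's next sequence number."""
    return tx_seq == account_seq

assert accept(5, 5)            # fresh transaction
assert not accept(4, 5)        # replaying an old transaction is rejected
```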
    Block-tree // definition of the block tree (LibraBFT pseudocode)
        Block // a block
            round;      // the round that generated this proposal
            payload;    // proposed transaction(s)
            parent_qc;  // QC for the parent block (previous round's signatures)
            id;         // unique digest of round, payload and parent_qc.id
        VoteInfo // vote information
            id, round;               // id and round of the block
            parent_id, parent_round; // id and round of the parent
            exec_state_id;           // speculated execution state
        LedgerCommitInfo // speculated commit result; voted on directly
            commit_state_id; // nil if no commit happens when this vote is aggregated to a QC
            vote_info_hash;  // hash of VoteMsg.vote_info
        VoteMsg // a vote
            vote_info;          // a VoteInfo record
            ledger_commit_info; // speculated ledger info
            sender <- u; signature <- sign_u(ledger_commit_info);
        QC // quorum certificate: a VoteMsg with multiple signatures
            vote_info
            ledger_commit_info
            signatures; // quorum of signatures
        pending_block_tree; // tree of blocks pending commitment
        pending_votes;      // collected votes per block, indexed by their LedgerInfo hash
        high_qc;            // highest known QC

        Procedure execute_and_insert(b) // execute and insert a block
            b.exec_state_id <- Ledger.speculate(b.parent_qc.block_id, b.id, b.payload)
            pending_block_tree.add(b)
            high_qc <- max_round{b.parent_qc, high_qc}

        Procedure process_vote(v) // vote processing
            vote_idx <- hash(v.ledger_commit_info)
            V[vote_idx] <- V[vote_idx] ∪ v.signature
            if |V[vote_idx]| = 2f + 1 then // a quorum of votes forms a QC
                qc <- QC<
                    vote_info <- v.vote_info,
                    state_id <- v.state_id,
                    votes <- V[vote_idx]
                >
                Pacemaker.advance_round(qc)
                high_qc <- max_round{qc, high_qc}

        Function generate_proposal(cmds) // propose a block
            return <
                b.round <- current_round,
                b.payload <- cmds,
                b.parent_qc <- high_qc,
                b.id <- hash(b.round || b.payload || b.parent_qc.id)
            >

        Function process_commit(id) // commit to the ledger
            Ledger.commit(id)
            pending_block_tree.prune(id) // id becomes the new root of pending
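The proposal and vote-counting logic in the pseudocode above can be approximated in a few lines of Python — a sketch under assumed encodings (the real implementation is in Rust):

```python
import hashlib

F = 1                       # number of tolerated faulty nodes; quorum = 2f + 1

def block_id(round_num: int, payload: str, parent_qc_id: str) -> str:
    # id = hash(round || payload || parent_qc.id), as in generate_proposal
    data = f"{round_num}|{payload}|{parent_qc_id}".encode()
    return hashlib.sha3_256(data).hexdigest()

def quorum_reached(signatures: set) -> bool:
    # a QC forms once 2f + 1 distinct validators have signed the same commit info
    return len(signatures) >= 2 * F + 1

genesis_qc_id = "0" * 64
b1 = block_id(1, "tx_a;tx_b", genesis_qc_id)

votes = {"validator_1", "validator_2"}       # not yet a quorum
assert not quorum_reached(votes)
votes.add("validator_3")
assert quorum_reached(votes)                 # 2f + 1 = 3 signatures -> QC forms
```

The `|`-joined string encoding is an invented stand-in for the real serialization, but the structure — a digest binding round, payload, and parent QC, plus a 2f+1 signature threshold — mirrors the pseudocode.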
    The Libra chain also contains an event (log) database. This separate database is important: transaction outputs are written to it so that a transaction's success can be verified. Whenever a transaction executes — successfully or not (out of gas, expired, and so on) — it is logged. A log entry contains the account's access path, the transaction itself, and the transaction's sequence number; together these three form a unique value, which is used to verify the validity of each transaction.
    Libra also uses SHA3-256, a SHA-3 family hash. Ethereum uses the closely related Keccak-256 (the pre-standardization variant of SHA-3), while Bitcoin does not use SHA-3 at all; SHA-3 is arguably an improvement on SHA2-256.
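Python's standard library ships the standardized SHA3-256, so the difference from SHA-2 is easy to see directly (note again that Ethereum's Keccak-256 is the pre-standardization variant and yields different digests):

```python
import hashlib

msg = b"libra"
d2 = hashlib.sha256(msg).hexdigest()      # SHA2-256, the family Bitcoin uses
d3 = hashlib.sha3_256(msg).hexdigest()    # SHA3-256, used by Libra
print("sha2-256:", d2)
print("sha3-256:", d3)
# same 256-bit output size, but entirely different internal designs
assert d2 != d3 and len(d2) == len(d3) == 64
```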
    Libra's consensus mechanism, HotStuff, is built on the latest advances in Byzantine fault tolerant (BFT) protocols. Previously, once a BFT protocol grew beyond roughly 30 nodes, its efficiency dropped sharply. The newest protocols, HotStuff among others, can scale to hundreds or even thousands of nodes while staying efficient. Only this breakthrough in BFT protocols makes consortium chains with hundreds, thousands, or tens of thousands of nodes possible — something earlier systems such as Hyperledger Fabric could not do. The figure below shows its core feature: chained blocks. Consensus on a block is reached over three rounds, each proposed by a different leader; every round is signed, and each round's signatures build on the previous round's.

    Every round of the BFT protocol builds on the round before it; in the third round the block is finally committed to the database. The QC in the figure is the certificate of signatures from more than 2/3 of the nodes (the quorum certificate).

    Each round carries such a QC (quorum certificate). A round contains a handful of fields: the block id, the round number, the transaction payload, the timestamp, and the signatures.
    This figure shows that even if one round fails to reach consensus — say the gas ran out or the time expired — consensus can still be reached over the following rounds. The K in the figure below denotes a Timeout Certificate (TC): the quorum certificate for a round that did not complete.
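The chained-rounds commit rule the figures describe can be modeled very simply: a block commits only when it heads a chain of quorum certificates from consecutive rounds, so a timed-out round (covered by a TC) merely delays commitment rather than breaking safety. This is a simplified model of the rule, not the real code:

```python
def committed_blocks(qc_rounds):
    """qc_rounds: ordered list of rounds that obtained a QC (timed-out
    rounds are simply missing). A block at round r commits once rounds
    r, r+1 and r+2 all obtained QCs (the contiguous 3-chain rule)."""
    have = set(qc_rounds)
    return [r for r in qc_rounds if r + 1 in have and r + 2 in have]

# Rounds 1-3 certified, round 4 timed out (TC), rounds 5-7 certified:
print(committed_blocks([1, 2, 3, 5, 6, 7]))   # -> [1, 5]
```

Round 2's block is not yet committed because round 4 timed out, but the protocol keeps going: once rounds 5-7 certify, round 5's block commits.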

    Today's HotStuff-style BFT algorithms can support on the order of a hundred nodes; supporting tens of thousands would require different, more scalable BFT protocols. The BFTree algorithm is one of them: it organizes the BFT nodes into a tree. As the figure shows, nodes that fail to reach consensus are moved out of the tree, so only nodes that do reach consensus remain in it.

    There is also a consensus mechanism called MOCA. As the chart shows — note the asterisk at the far right — its node count can reach one million while its TPS still reaches tens of thousands; that is what makes this mechanism special.

    There are also things HotStuff describes but does not concretely specify or implement, which Libra had to work out: how is each step computed, and how is each round timed? Libra has a new implementation that measures progress in rounds and steps rather than absolute time, avoiding the discrepancies that arise when nodes in different time zones read different clocks.
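The idea of advancing by rounds instead of wall-clock time can be sketched like this: a validator's round is derived only from the certificates it has seen, never from its local clock, so clock skew between nodes cannot desynchronize them. This is an invented minimal model, not Libra's actual pacemaker:

```python
class Pacemaker:
    """Round synchronization driven purely by certificates (QC or TC),
    so validators in different time zones need no synchronized clocks."""

    def __init__(self):
        self.current_round = 0

    def advance_round(self, cert_round: int) -> bool:
        # Enter round cert_round + 1 if the certificate is new enough.
        if cert_round < self.current_round:
            return False          # stale certificate, ignore
        self.current_round = cert_round + 1
        return True

pm = Pacemaker()
pm.advance_round(0)      # QC for round 0 -> enter round 1
pm.advance_round(1)      # QC for round 1 -> enter round 2
pm.advance_round(0)      # stale QC is ignored
print(pm.current_round)  # -> 2
```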
    Then there is the Move language. Move's most important property is the separation of assets from logic. The Libra coin, for instance, is defined as the resource (asset) part of a contract, while the contract's logic lives in modules. In Libra an asset can only be created, destroyed, or moved — never copied — like a physical object that exists only as the original, with no duplicates. Below are the components of the Move language:
    a) The Move language
    ├── README.md           # This README
    ├── benchmarks          # Benchmarks for the Move language VM and surrounding code
    ├── bytecode-verifier   # The bytecode verifier
    ├── e2e-tests           # Infrastructure and tests for the end-to-end flow
    ├── functional_tests    # Testing framework for the Move language
    ├── compiler            # The IR to Move bytecode compiler
    ├── stdlib              # Core Move modules and transaction scripts
    ├── test.sh             # Script for running all the language tests
    └── vm
        ├── cost-synthesis  # Cost synthesis for bytecode instructions (gas analysis)
        ├── src             # Bytecode language definitions, serializer, and deserializer
        ├── tests           # VM tests
        ├── vm-genesis      # The genesis state creation, and blockchain genesis writeset
        └── vm-runtime      # The bytecode interpreter

    b) The Move compiler compiles modules and scripts into bytecode (development this way is slow and inefficient) (Module & Script)
    USAGE:
        compiler [FLAGS] [OPTIONS]
    FLAGS:
        -h, --help                Prints help information
        -l, --list_dependencies   Instead of compiling the source, emit a dependency list of the compiled source
        -m, --module              Treat input file as a module (default is to treat file as a program)
            --no-stdlib           Do not automatically compile stdlib dependencies
            --no-verify           Do not automatically run the bytecode verifier
        -V, --version             Prints version information
    OPTIONS:
        -a, --address   Account address used for publishing
            --deps      Path to the list of modules that we want to link with
        -o, --output    Serialize and write the compiled output to this file
    ARGS:
        Path to the Move IR source to compile

    c) Components of the Move compiler
    compiler                    # Main compiler crate. This depends on stdlib.
    ├── ir-to-bytecode          # Core backend compiler logic, independent of stdlib.
    │   ├── src
    │   │   ├── compiler.rs     # Main compiler logic - converts an AST generated by `syntax.rs` to a `CompiledModule` or `CompiledScript`.
    │   │   └── parser.rs       # Wrapper around Move IR syntax crate.
    │   └── syntax              # Crate containing Move IR syntax.
    │       └── src
    │           ├── ast.rs      # Contains all the data structures used to build the AST representing the parsed Move IR input.
    │           ├── syntax.lalrpop # Description of the Move IR language, used by lalrpop to generate a parser.
    │           └── syntax.rs   # Parser generated by lalrpop using the description in `syntax.lalrpop` - a clean checkout won't contain this file.
    └── src
        ├── main.rs             # Compiler driver - parses command line options and calls the parser, compiler, and bytecode verifier.
        └── util.rs             # Misc compiler utilities.
    Here is an example of a Move transaction. The key point is that the asset, the Libra coin (`import 0x0.LibraCoin;`), can only be transferred:
    // A small variant of the peer-peer payment example that creates a fresh
    // account if one does not already exist.
    import 0x0.LibraAccount;
    import 0x0.LibraCoin;
    main(payee: address, amount: u64) {
        let coin: LibraCoin.T;
        let account_exists: bool;
        // Acquire a LibraCoin.T resource with value `amount` from the sender's
        // account. This will fail if the sender's balance is less than `amount`.
        // (withdrawal)
        coin = LibraAccount.withdraw_from_sender(move(amount));
        account_exists = LibraAccount.exists(copy(payee));
        if (!move(account_exists)) {
            // Creates a fresh account at the address `payee` by publishing a
            // LibraAccount.T resource under this address. If there is already a
            // LibraAccount.T resource under the address, this will fail.
            create_account(copy(payee));
        }
        // (deposit)
        LibraAccount.deposit(move(payee), move(coin));
        return;
    }
    Only the LibraCoin module itself can create, destroy, and merge Libra coins. Below are definitions of the two asset-transfer functions used above on the LibraAccount account, withdraw and deposit (for illustration only — this is not the actual Libra code):
    // A digital currency
    module Currency {
        // the currency resource itself
        resource Coin { value: u64 }

        // deposit: merge `to_deposit` into the payee's coin
        public deposit(payee: address, to_deposit: Coin) {
            let to_deposit_value: u64 = Unpack<Coin>(move(to_deposit));
            let coin_ref: &mut Coin = BorrowGlobal<Coin>(move(payee));
            let coin_value_ref: &mut u64 = &mut move(coin_ref).value;
            let coin_value: u64 = *move(coin_value_ref);
            *move(coin_value_ref) = move(coin_value) + move(to_deposit_value);
        }

        // withdraw: split `amount` out of the sender's coin
        public withdraw_from_sender(amount: u64): Coin {
            let transaction_sender_address: address = GetTxnSenderAddress();
            let coin_ref: &mut Coin = BorrowGlobal<Coin>(move(transaction_sender_address));
            let coin_value_ref: &mut u64 = &mut move(coin_ref).value;
            let coin_value: u64 = *move(coin_value_ref);
            RejectUnless(copy(coin_value) >= copy(amount));
            *move(coin_value_ref) = move(coin_value) - copy(amount);
            let new_coin: Coin = Pack<Coin>(move(amount));
            return move(new_coin);
        }
    }
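The resource discipline the Currency module relies on — coins can be created, destroyed, split, and merged, but never copied — can be imitated in Python by consuming a coin whenever it is used. This is only a loose analogy: Move enforces the rule statically in the bytecode verifier, while this sketch checks at runtime:

```python
class Coin:
    """A 'linear' coin: every operation consumes its inputs, so the same
    coin object can never be spent twice."""

    def __init__(self, value: int):
        self.value = value
        self.consumed = False

    def _take(self) -> int:
        if self.consumed:
            raise RuntimeError("coin already consumed (no copying allowed)")
        self.consumed = True
        return self.value

def join(a: Coin, b: Coin) -> Coin:
    return Coin(a._take() + b._take())      # both inputs are destroyed

def split(c: Coin, amount: int):
    v = c._take()                           # the original coin is destroyed
    assert v >= amount, "insufficient value"
    return Coin(v - amount), Coin(amount)

c = join(Coin(7), Coin(3))       # one coin worth 10
rest, paid = split(c, 4)         # coins worth 6 and 4
try:
    split(c, 1)                  # double-spend attempt fails
except RuntimeError as e:
    print(e)
```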
    The real Libra coin, LibraCoin, is defined in the Libra codebase as follows:
    https://github.com/libra/libra/blob/master/language/move-lang/stdlib/modules/libra_coin.move
    // Definition of the Libra coin
    address 0x0:
    module LibraCoin {
        use 0x0::Transaction;

        // A resource representing the Libra coin
        // The value of the coin. May be zero
        resource struct T { value: u64 }

        // A singleton resource that grants access to `LibraCoin::mint`. Only the Association has one.
        resource struct MintCapability {}

        // The sum of the values of all LibraCoin::T resources in the system
        resource struct MarketCap { total_value: u64 }

        // Return a reference to the MintCapability published under the sender's account. Fails if the
        // sender does not have a MintCapability.
        // Since only the Association account has a mint capability, this will only succeed if it is
        // invoked by a transaction sent by that account.
        public mint_with_default_capability(amount: u64): T acquires MintCapability, MarketCap {
            mint(amount, borrow_global<MintCapability>(Transaction::sender()))
        }

        // Mint a new LibraCoin::T worth `value`. The caller must have a reference to a MintCapability.
        // Only the Association account can acquire such a reference, and it can do so only via
        // `borrow_sender_mint_capability`
        public mint(value: u64, capability: &MintCapability): T acquires MarketCap {
            // TODO: temporary measure for testnet only: limit minting to 1B Libra at a time.
            // this is to prevent the market cap's total value from hitting u64_max due to excessive
            // minting. This will not be a problem in the production Libra system because coins will
            // be backed with real-world assets, and thus minting will be correspondingly rarer.
            // * 1000000 because the unit is microlibra
            Transaction::assert(value <= 1000000000 * 1000000, 11);
            // update market cap resource to reflect minting
            let market_cap = borrow_global_mut<MarketCap>(0xA550C18);
            market_cap.total_value = market_cap.total_value + value;
            T { value }
        }

        // This can only be invoked by the Association address, and only a single time.
        // Currently, it is invoked in the genesis transaction
        public initialize() {
            // Only callable by the Association address
            Transaction::assert(Transaction::sender() == 0xA550C18, 1);
            move_to_sender(MintCapability{});
            move_to_sender(MarketCap { total_value: 0 });
        }

        // Return the total value of all Libra in the system
        public market_cap(): u64 acquires MarketCap {
            borrow_global<MarketCap>(0xA550C18).total_value
        }

        // Create a new LibraCoin::T with a value of 0
        public zero(): T {
            T { value: 0 }
        }

        // Public accessor for the value of a coin
        public value(coin: &T): u64 {
            coin.value
        }

        // Splits the given coin into two and returns them both
        // It leverages `Self::withdraw` for any verifications of the values
        public split(coin: T, amount: u64): (T, T) {
            let other = withdraw(&mut coin, amount);
            (coin, other)
        }

        // "Divides" the given coin into two, where the original coin is modified in place
        // The original coin will have value = original value - `amount`
        // The new coin will have a value = `amount`
        // Fails if the coin's value is less than `amount`
        public withdraw(coin: &mut T, amount: u64): T {
            // Check that `amount` is less than the coin's value
            Transaction::assert(coin.value >= amount, 10);
            // Split the coin
            coin.value = coin.value - amount;
            T { value: amount }
        }

        // Merges two coins and returns a new coin whose value is equal to the sum of the two inputs
        public join(coin1: T, coin2: T): T {
            deposit(&mut coin1, coin2);
            coin1
        }

        // "Merges" the two coins
        // The coin passed in by reference will have a value equal to the sum of the two coins
        // The `check` coin is consumed in the process
        public deposit(coin: &mut T, check: T) {
            let T { value } = check;
            coin.value = coin.value + value
        }

        // Destroy a coin
        // Fails if the value is non-zero
        // The amount of LibraCoin::T in the system is a tightly controlled property,
        // so you cannot "burn" any non-zero amount of LibraCoin::T
        public destroy_zero(coin: Self::T) {
            let T { value } = coin;
            Transaction::assert(value == 0, 11)
        }
    }
    Because Libra is a system focused on payments, so far it does not support two-way atomic transactions — that is, atomic swaps between two currencies.
    The figure below shows Libra's transaction flow.

    Libra's transaction flow starts with the client sending a request to a validator node (step 1). The validator passes the request to the virtual machine to check whether it is valid: does the sender have enough money, is it signed, is the signature valid, is the format correct, and so on. If the request is a query for an existing transaction, it is sent to the database and the transaction information is returned to the requester (step 2). If it is a new transaction rather than a query, it goes to the mempool (step 3). The mempool validates the transaction further — for instance, whether its sequence number is current (step 4). Qualified transactions are forwarded to the mempools of the other validators (step 5). If this validator is the leader for the round, the consensus layer pulls all valid transactions out of the mempool as the block's payload (step 6) and sends the block to all validators (step 7). Each validator hands the transactions in the block to its execution module (step 8), which, before consensus is reached, sends them to the virtual machine for speculative execution (step 9). The execution module adds the execution results to the in-memory Merkle tree (step 10). The leader then tries, following the LibraBFT consensus protocol, to collect signatures from a quorum of the 3f+1 validators (step 11). Once the leader has collected enough validator signatures (2f+1), the execution module writes the transaction results and the Merkle tree from memory into the database (step 12).
    The Libra chain comes with a command-line interface (CLI) from which all sorts of commands can be run. With it you can stand up a local test chain — a multi-node test chain on a single machine. Setting up a test chain across multiple servers is more troublesome; the required information can be dug out of the Libra source code (we did not know this at first either). Libra's main chain is currently a pre-mainnet and is not public, including some of its concrete deployment details. That is a characteristic of consortium chains: some things are simply not public.
    To date the Move language is still an IR — an intermediate language, not the official Move language, only a script language. But this script language can already be used to write Move contracts, including a contract's asset definitions and its logic.
    Digital currencies and DeFi
    a) Payments and two-way trades
        i. In a payment, one side trades currency while the other provides goods or services
        ii. In a two-way trade, both sides trade currency, but in two different currencies
    b) DEX (decentralized exchanges)
        i. Posting orders
        ii. Matching
        iii. Executing trades
    c) Lending
        i. Collateral
        ii. Interest
        iii. Terms
    DeFi covers payments and two-way trades, also called swaps. A payment is one side trading currency while the other provides goods or services — so a payment is a one-sided transfer of money, while a swap is two-sided. Exchanging Hong Kong dollars for US dollars, for example, is a swap between two different currencies. Move contracts on Libra do not support this today, yet the atomic swap is one of the most fundamental primitives of future DeFi. Within blockchain, atomic swaps currently get the most attention in distributed, or decentralized, exchanges — a response to all the problems of centralized exchanges. Posting orders, matching, and executing trades all depend on solving this two-way trade, i.e. the atomic swap. Only on that foundation can lending — collateral, interest, terms — be built. All of this is the job of smart contracts, which are already mature on Ethereum but have not yet appeared on Libra; the protocols that have matured on Ethereum will need to be ported over, rewriting the Ethereum contracts as Move contracts.
    Predictions and expectations for Libra and Move
    First, we hope the official version of Move ships — that is the most important thing. We also hope Libra will support more private transactions. One concrete form: on-chain transactions that satisfy regulatory requirements while off-chain transactions are fully private, like the payment channels of the Lightning Network, where full privacy is achievable inside the channel while the on-chain transaction stays compliant. The best combination we can think of is semi-private transactions on-chain and fully private transactions off-chain.
    Support for atomic swaps is one of the hottest topics in blockchain right now. The figure below illustrates an atomic swap.

    Suppose you want to swap one BTC for the equivalent ETH. First you place an order, and someone agrees to trade with you. You put the BTC in a box and lock it, then hand the counterparty a key that lets them open the box only to put the ETH in. Now the box holds both coins; after the exchange, each party takes out the coin they wanted. That is the basic principle of an atomic swap; there are different ways to implement it concretely.
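The lock-box mechanism described above is commonly realized with hash time-locked contracts (HTLCs): both parties lock their coins under the same hash, and revealing the preimage to claim one side automatically enables the other side's claim. Below is a toy single-process model with invented names — no networking, time locks, or real chains:

```python
import hashlib

def h(x: bytes) -> str:
    return hashlib.sha3_256(x).hexdigest()

class HTLC:
    """Funds claimable only by whoever presents the preimage of `hashlock`."""

    def __init__(self, asset: str, amount: float, hashlock: str):
        self.asset, self.amount, self.hashlock = asset, amount, hashlock
        self.claimed_by = None

    def claim(self, preimage: bytes, claimer: str) -> bool:
        if self.claimed_by is None and h(preimage) == self.hashlock:
            self.claimed_by = claimer
            return True
        return False

secret = b"alice's secret"
lock = h(secret)
btc_box = HTLC("BTC", 1.0, lock)       # Alice locks 1 BTC
eth_box = HTLC("ETH", 30.0, lock)      # Bob locks 30 ETH under the same hash
# Alice claims the ETH, which reveals the secret...
eth_box.claim(secret, "alice")
# ...letting Bob reuse the now-public secret to claim the BTC:
btc_box.claim(secret, "bob")
print(btc_box.claimed_by, eth_box.claimed_by)   # -> bob alice
```

In a real HTLC a timeout refunds each party if the counterparty never claims; that refund path is omitted here for brevity.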
    Because essentially all current DeFi protocols are built on the ERC20 standard, the hope is that digital currencies issued on Libra will likewise support an ERC20-like standard. The ERC20 standard is listed here.

    It has a few main functions. First, the total supply — how many coins were issued in total. Second, the current balance of an address. Third, the allowance — how much an approved spender may transfer. Then transfer itself, approving transfers, and transferring from one address to another. There are also events that log the results of transfers and approvals.
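The ERC20 functions just listed — totalSupply, balanceOf, allowance, transfer, approve, transferFrom — can be mimicked by a small in-memory ledger. This sketches the interface's semantics only; it is not Solidity and omits events:

```python
class Token:
    def __init__(self, supply: int, owner: str):
        self._total = supply
        self._balances = {owner: supply}
        self._allowed = {}                      # (owner, spender) -> amount

    def total_supply(self) -> int:
        return self._total

    def balance_of(self, who: str) -> int:
        return self._balances.get(who, 0)

    def allowance(self, owner: str, spender: str) -> int:
        return self._allowed.get((owner, spender), 0)

    def transfer(self, sender: str, to: str, amount: int) -> None:
        assert self.balance_of(sender) >= amount, "insufficient balance"
        self._balances[sender] -= amount
        self._balances[to] = self.balance_of(to) + amount

    def approve(self, owner: str, spender: str, amount: int) -> None:
        self._allowed[(owner, spender)] = amount

    def transfer_from(self, spender: str, frm: str, to: str, amount: int) -> None:
        assert self.allowance(frm, spender) >= amount, "allowance exceeded"
        self._allowed[(frm, spender)] -= amount
        self.transfer(frm, to, amount)

t = Token(1000, "alice")
t.transfer("alice", "bob", 100)
t.approve("alice", "exchange", 50)
t.transfer_from("exchange", "alice", "carol", 30)
print(t.balance_of("alice"), t.balance_of("bob"), t.balance_of("carol"))  # -> 870 100 30
```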
    Below is the 0x protocol API.
    contract IAuthorizable { // authorization interface
        /// @dev Gets all authorized addresses.
        /// @return Array of authorized addresses.
        function getAuthorizedAddresses()
            external
            view
            returns (address[]);

        /// @dev Authorizes an address.
        /// @param target Address to authorize.
        function addAuthorizedAddress(address target)
            external;

        /// @dev Removes authorization of an address.
        /// @param target Address to remove authorization from.
        function removeAuthorizedAddress(address target)
            external;

        /// @dev Removes authorization of an address.
        /// @param target Address to remove authorization from.
        /// @param index Index of target in authorities array.
        function removeAuthorizedAddressAtIndex(
            address target,
            uint256 index
        )
            external;
    }

    contract IAssetProxy is IAuthorizable { // asset proxy interface (authorized callers only)
        /// @dev Transfers assets. Either succeeds or throws.
        /// @param assetData Byte array encoded for the respective asset proxy.
        /// @param from Address to transfer asset from.
        /// @param to Address to transfer asset to.
        /// @param amount Amount of asset to transfer.
        function transferFrom(
            bytes assetData,
            address from,
            address to,
            uint256 amount
        )
            external;

        /// @dev Gets the proxy id associated with the proxy address.
        /// @return Proxy id.
        function getProxyId()
            external
            view
            returns (uint8);
    }
    As you can see in the 0x protocol API: first, get the list of already-authorized addresses; then authorize an address, i.e. add it to the authorized list; then revoke an address's authorization, i.e. remove it from the list. Once a proxy has been authorized, it can transfer from one address to another. Finally, you can fetch the proxy id.

    This figure shows a very simple distributed, or decentralized, exchange. There is a buyer (taker) and a seller (maker). The maker first posts an order to the orderbook; when the taker sees it, they execute the order through the 0x protocol API. The 0x smart contract then delivers each side of the trade to the maker and the taker respectively. The relayer layer is the most important piece. All of this could plausibly be implemented in Move contracts.
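The maker/taker flow of such a DEX — post an order to the orderbook, then another party fills it — can be sketched as a minimal matching engine. The structure is invented for illustration; real 0x relayers also handle order signatures and on-chain settlement:

```python
class OrderBook:
    """Minimal maker/taker matching: makers post limit sell orders,
    a taker fills the best-priced order within their price limit."""

    def __init__(self):
        self.asks = []        # sell orders: (price, amount, maker)

    def post(self, maker: str, price: int, amount: int) -> None:
        self.asks.append((price, amount, maker))
        self.asks.sort()                      # best (lowest) price first

    def fill(self, taker: str, max_price: int, amount: int):
        # Execute against the best ask if the taker's price limit allows it.
        if not self.asks or self.asks[0][0] > max_price:
            return None                       # no match
        price, avail, maker = self.asks.pop(0)
        traded = min(avail, amount)
        if avail > traded:                    # leave the remainder on the book
            self.post(maker, price, avail - traded)
        return {"maker": maker, "taker": taker, "price": price, "amount": traded}

book = OrderBook()
book.post("maker_1", 31, 20)      # sell 20 units at price 31
book.post("maker_2", 30, 10)      # sell 10 units at price 30 (best ask)
trade = book.fill("taker_1", 30, 4)
print(trade)                      # filled by maker_2 at price 30, amount 4
```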
    Libra chain development progress
    The master branch receives dozens of commits a day, touching hundreds of files.
    The testnet has been updated every two weeks (at first daily, then weekly); the most recent update came after a month.
    A beta release is expected by the end of March 2020.
    The first release is expected by the end of June 2020.
    The official version of the Move language should ship with the beta.
    On regulation
    2020 is also a US election year. A launch before the end of 2020 would presumably come after the election — depending, of course, on how the election unfolds.
    Facebook Pay fiat payments: progress has been rocky, but it is a necessary apprenticeship and a sensible two-track approach.
    Will Libra ultimately support multiple digital currencies? Facebook Pay should at least support multiple fiat currencies.
    The separation of assets and code in Move contracts is the core challenge in porting the virtual-coin DeFi protocols from Ethereum.
    DeFi protocols for real-money digital currencies need to be rethought on top of that asset/code separation rather than copied from virtual-coin DeFi. Digital-currency DeFi development on Libra will come after 2020, and real-money digital-currency DeFi will be built on Libra.
    The Move VM will eventually yield to WAVM (a WebAssembly VM).
    That concludes my talk; now it is time for questions.
    (To be continued)
    Special thanks to the sponsors and hosts, the Soteria community and the 魔笛手技术开发社区 (Pied Piper developer community), the communities below that joined the live stream (in no particular order), and everyone who took part in the discussion.
    - Soteria 硬核科技社区(主播群)
    - Soteria SSDE 开发社区
    - 魔笛手技术开发社区
    - 数字万物讨论群
    - 盗火者区块链应用联盟
    - 新通证经济之无名高地
    - 7月线下线上交流学习社群
    - 加密数字货币与区块链生态系统
    - 区块链那些事
    - 世界区块链经济共同体总群
    - Metamask中文社区
    - 思宇认可和欣赏的朋友群
    - 西电区块链兴趣组
    Soteria talk by Zou Jie: a professional analysis of Libra technology | Mars Finance technical post