Question about the attention implementation in the DIN model
Created by: tz28
Does "out product" in the paper's attention section mean the outer product? In the paper, the attention input is the concatenation [hist, target_expand, out product(hist, target_expand)], whereas the Paddle implementation does concat = fluid.layers.concat([hist, target_expand, hist - target_expand, hist * target_expand], axis=2). What is the relationship between out product(hist, target_expand) and the terms hist - target_expand and hist * target_expand? Thanks.
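For reference, here is a minimal NumPy sketch of what the quoted Paddle line computes; the shapes (batch size B, history length T, embedding dim D) and tensor names are assumptions for illustration only:

```python
import numpy as np

# Assumed shapes: B = batch size, T = history length, D = embedding dim
B, T, D = 2, 5, 8
hist = np.random.randn(B, T, D).astype("float32")         # history item embeddings
target = np.random.randn(B, D).astype("float32")          # candidate (target) item embedding
target_expand = np.repeat(target[:, None, :], T, axis=1)  # tile target to every history position

# Feature construction matching the quoted fluid.layers.concat call:
# an element-wise difference and an element-wise (Hadamard) product,
# concatenated along the embedding axis.
concat = np.concatenate(
    [hist, target_expand, hist - target_expand, hist * target_expand], axis=2
)
print(concat.shape)  # (B, T, 4 * D)
```

Note that hist * target_expand here is an element-wise product over the embedding dimension, so the resulting feature has shape (B, T, 4 * D) rather than containing a full D x D outer-product matrix per position.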