We developed a deep fusion methodology that combines nondestructive in-situ thermal and ex-situ ultrasonic images for porosity detection in laser-based additive manufacturing (LBAM). A core challenge in LBAM is the lack of fusion between successive layers of printed metal. Ultrasonic imaging can capture structural abnormalities by passing waves through successive layers, whereas in-situ thermal images track the thermal history during fabrication. The proposed sensor fusion U-Net methodology bridges the gap between in-situ and ex-situ sensing by employing a two-branch convolutional neural network (CNN) for feature extraction and segmentation, producing a 2D porosity image. We modify the U-Net framework with Inception and long short-term memory (LSTM) blocks. We validate the models by comparing our single-modality and fusion models against ground-truth X-ray computed tomography (XCT) images. The Inception U-Net fusion model achieves the highest mean intersection-over-union score of 0.93.
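The two-branch fusion idea can be illustrated in miniature: each modality passes through its own feature-extraction branch, the branch outputs are concatenated along the channel axis, and a final per-pixel decision yields a 2D porosity mask. This is a minimal NumPy sketch, not the paper's architecture; the kernels, fusion weights, image sizes, and single-layer branches here are all illustrative placeholders (a real U-Net stacks many convolution, pooling, and upsampling stages with skip connections).

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D convolution of a single-channel image with a small kernel."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
thermal = rng.random((16, 16))      # stand-in for an in-situ thermal frame
ultrasonic = rng.random((16, 16))   # stand-in for an ex-situ ultrasonic slice

# Branch-specific feature extraction: one conv + ReLU per modality (placeholder
# for each branch of the two-branch CNN).
k_t = rng.standard_normal((3, 3)) * 0.1
k_u = rng.standard_normal((3, 3)) * 0.1
feat_t = np.maximum(conv2d(thermal, k_t), 0)
feat_u = np.maximum(conv2d(ultrasonic, k_u), 0)

# Fusion: stack the branch features as channels, then mix them with a
# 1x1-convolution-style weighted sum (weights are illustrative).
fused = np.stack([feat_t, feat_u], axis=0)      # shape (2, 14, 14)
w = np.array([0.6, 0.4])
logits = np.tensordot(w, fused, axes=1)         # shape (14, 14)

# Per-pixel segmentation decision: sigmoid + threshold -> 2D porosity mask.
mask = 1 / (1 + np.exp(-logits)) > 0.5
print(mask.shape)
```

The same channel-concatenation fusion pattern is what a trained two-branch network would learn end-to-end; here the weights are fixed only so the sketch runs standalone.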