
Column-Wise Multiplication of Two Matrices in Python

A tutorial on column-wise multiplication of two matrices using plain Python lists

Suppose we want to multiply two lists of different lengths in Python. How do we do it?
There are several ways:
The built-in map() can do it, but only for lists of equal length.
NumPy works too, but creating NumPy arrays just for this adds complexity we don't need.
Here I'll show you how to do it simply, using plain list logic.

Suppose I have a 2D array and I want to multiply it column-wise by a vector. The first step is to read the matrix column by column, i.e. walk it in transposed order:

[ 1 2 3 ]      [1, 4, 7]
[ 4 5 6 ] ->   [2, 5, 8]
[ 7 8 9 ]      [3, 6, 9]

So in Python we represent the matrix as a list inside a list (a nested list) and loop over the columns:

a = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
b = [1, 2, 3]
for col in range(len(b)):
    # a[row][col] walks down column `col` of a
    print([a[row][col] for row in range(len(a))])


[1, 4, 7]
[2, 5, 8]
[3, 6, 9]
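The loop above only extracts the columns; it never touches b. To actually perform the column-wise multiplication, each extracted column is multiplied element by element with b. A minimal sketch of that extra step (variable names `cols`, `i`, `j` are my own) could be:

```python
a = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
b = [1, 2, 3]

# Each inner list is one column of `a`, multiplied element by element with `b`.
cols = []
for i in range(len(a[0])):          # i indexes the columns of `a`
    cols.append([a[j][i] * b[j] for j in range(len(b))])

print(cols)   # [[1, 8, 21], [2, 10, 24], [3, 12, 27]]
```

Here column 0 of a is [1, 4, 7], and multiplying it element-wise with b = [1, 2, 3] gives [1, 8, 21], and so on for the other columns.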

Now a larger example: multiply each column of mul element-wise by hn.

mul = [[1, 2, 3, 4, 0, 0], [0, 1, 2, 3, 4, 0], [0, 0, 1, 2, 3, 4],
       [4, 0, 0, 1, 2, 3], [3, 4, 0, 0, 1, 2], [2, 3, 4, 0, 0, 1]]
hn = [-3, 2, 1, 0, 0, 0]

so the answer should be:
[[-3, 0, 0, 0, 0, 0], [-6, 2, 0, 0, 0, 0], [-9, 4, 1, 0, 0, 0], [-12, 6, 2, 0, 0, 0], [0, 8, 3, 0, 0, 0], [0, 0, 4, 0, 0, 0]]

for i in range(len(hn)):
    # column i of mul, multiplied element by element with hn
    print([mul[j][i] * hn[j] for j in range(len(hn))])
[-3, 0, 0, 0, 0, 0]
[-6, 2, 0, 0, 0, 0]
[-9, 4, 1, 0, 0, 0]
[-12, 6, 2, 0, 0, 0]
[0, 8, 3, 0, 0, 0]
[0, 0, 4, 0, 0, 0]
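For comparison, even though this tutorial avoids NumPy, the same column-wise product is a one-liner there (assuming NumPy is installed): transposing mul makes each row a column, and broadcasting multiplies every row by hn.

```python
import numpy as np

mul = np.array([[1, 2, 3, 4, 0, 0], [0, 1, 2, 3, 4, 0], [0, 0, 1, 2, 3, 4],
                [4, 0, 0, 1, 2, 3], [3, 4, 0, 0, 1, 2], [2, 3, 4, 0, 0, 1]])
hn = np.array([-3, 2, 1, 0, 0, 0])

# Row i of mul.T is column i of mul; broadcasting multiplies each row by hn.
ans = mul.T * hn
print(ans.tolist())
```

This prints the same six lists as the loop above, collected into one list of lists.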

Now, if we want to collect the answer into another list instead of printing it, we simply append each column product:

result = []   # avoid naming this `next`, which shadows the built-in next()
for i in range(len(hn)):
    result.append([mul[j][i] * hn[j] for j in range(len(hn))])
print(result)
[[-3, 0, 0, 0, 0, 0], [-6, 2, 0, 0, 0, 0], [-9, 4, 1, 0, 0, 0], [-12, 6, 2, 0, 0, 0], [0, 8, 3, 0, 0, 0], [0, 0, 4, 0, 0, 0]]
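A more idiomatic pure-Python version of the same idea (a sketch, not part of the original code) uses zip(*mul), which yields the columns of mul one at a time, so no index arithmetic is needed:

```python
mul = [[1, 2, 3, 4, 0, 0], [0, 1, 2, 3, 4, 0], [0, 0, 1, 2, 3, 4],
       [4, 0, 0, 1, 2, 3], [3, 4, 0, 0, 1, 2], [2, 3, 4, 0, 0, 1]]
hn = [-3, 2, 1, 0, 0, 0]

# zip(*mul) transposes mul; each `col` is then paired element-wise with hn.
result = [[m * h for m, h in zip(col, hn)] for col in zip(*mul)]
print(result)
```

This produces exactly the same list of lists as the explicit loop above.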
