Dataset schema (column name, type, and reported length/value ranges):

| Column | Type | Range / Values |
| --- | --- | --- |
| instance_id | string | length 17-39 |
| repo | string | 8 distinct values |
| issue_id | string | length 14-34 |
| pr_id | string | length 14-34 |
| linking_methods | sequence | 1-3 items |
| base_commit | string | length 40 |
| merge_commit | string, nullable (βŒ€) | length 0-40 |
| hints_text | sequence | 0-106 items |
| resolved_comments | sequence | 0-119 items |
| created_at | unknown | |
| labeled_as | sequence | 0-7 items |
| problem_title | string | length 7-174 |
| problem_statement | string | length 0-55.4k |
| gold_files | sequence | 0-10 items |
| gold_files_postpatch | sequence | 1-10 items |
| test_files | sequence | 0-60 items |
| gold_patch | string | length 220-5.83M |
| test_patch | string, nullable (βŒ€) | length 386-194k |
| split_random | string | 3 distinct values |
| split_time | string | 3 distinct values |
| issue_start_time | timestamp[ns] | |
| issue_created_at | unknown | |
| issue_by_user | string | length 3-21 |
| split_repo | string | 3 distinct values |
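This header is the flattened column summary that the Hugging Face dataset viewer shows for an issue-to-PR linking corpus; the sample records below follow the same column order. As a minimal sketch of how such a dataset could be loaded and inspected, assuming it is published on the Hugging Face Hub (the dataset path `your-org/issue-pr-links` below is a hypothetical placeholder, not the corpus's real identifier):

```python
# Minimal sketch: load one record of this issue-to-PR linking corpus and
# inspect the fields from the schema above. The dataset path is a
# hypothetical placeholder; substitute the real Hub identifier.
from datasets import load_dataset

ds = load_dataset("your-org/issue-pr-links", split="train")  # hypothetical path

row = ds[0]
print(row["instance_id"])       # e.g. "provectus/kafka-ui/2008_2033"
print(row["problem_title"])     # title of the linked GitHub issue
print(row["labeled_as"])        # list of issue labels
print(row["gold_patch"][:200])  # first 200 chars of the resolving PR's diff
```

Each record appears to pair a GitHub issue (`issue_id`, `problem_statement`, `hints_text`) with the pull request that resolved it (`pr_id`, `gold_patch`, `test_patch`), plus three alternative train/test split assignments (`split_random`, `split_time`, `split_repo`).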
provectus/kafka-ui/2008_2033
provectus/kafka-ui
provectus/kafka-ui/2008
provectus/kafka-ui/2033
[ "timestamp(timedelta=0.0, similarity=0.8563359218443907)", "connected" ]
7211a18b5770e18087ccbf159699860b31f700e8
7ba10c1b7a519d40f87c0970ad3b0ca5a8973d46
[]
[]
"2022-05-26T08:10:30Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
No error displayed in case of wrong header submitting with Message producing
**Describe the bug** User should be notified about entering a wrong value/format for Headers within the Message produce form **Steps to Reproduce** Steps to reproduce the behavior: 1. Login to system 2. Navigate to Topic's profile 3. Turn to Messages tab 4. Produce Message 5. Enter wrong value/format within Headers 6. Press Send **Expected behavior** Should display an error message about the wrong header **Screenshots** https://user-images.githubusercontent.com/104780608/169745897-4b200628-a29c-4646-b095-4d49e0eaffea.mov
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx index a4087a36cf5..35894e9bfb9 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx @@ -102,8 +102,14 @@ const SendMessage: React.FC = () => { }) => { if (messageSchema) { const { partition, key, content } = data; - const headers = data.headers ? JSON.parse(data.headers) : undefined; const errors = validateMessage(key, content, messageSchema); + if (data.headers) { + try { + JSON.parse(data.headers); + } catch (error) { + errors.push('Wrong header format'); + } + } if (errors.length > 0) { const errorsHtml = errors.map((e) => `<li>${e}</li>`).join(''); dispatch( @@ -117,7 +123,7 @@ const SendMessage: React.FC = () => { ); return; } - + const headers = data.headers ? JSON.parse(data.headers) : undefined; try { await messagesApiClient.sendTopicMessages({ clusterName,
null
test
train
2022-05-30T14:55:16
"2022-05-23T04:55:29Z"
armenuikafka
train
provectus/kafka-ui/2014_2034
provectus/kafka-ui
provectus/kafka-ui/2014
provectus/kafka-ui/2034
[ "connected" ]
8393232a8301277fd397b5910134dc2a754fda52
cc109a712553ed41a98a3887958a885590886563
[]
[]
"2022-05-26T08:33:24Z"
[ "type/enhancement", "scope/frontend", "status/accepted" ]
UI improvement for partition dropdown field within Topic/Messages
**Describe the bug** The 'x' icon for removing the partition value is displayed by default; it would be better to show it only on hover **Steps to Reproduce** Steps to reproduce the behavior: 1. Login to system 2. Navigate to Messages tab for a Topic 3. Check the partition dropdown within the filtering section **Expected behavior** The 'x' icon for the partition dropdown field should be displayed not by default but only on hovering over the dropdown **Screenshots** <img width="1715" alt="x icon" src="https://user-images.githubusercontent.com/104780608/169801566-f19e98c9-834c-471a-97e1-e96ca9523404.png"> **Additional context** <!-- (Add any other context about the problem here) -->
[ "kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts" ]
[ "kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts b/kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts index f4767bd76df..0052a99411b 100644 --- a/kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts +++ b/kafka-ui-react-app/src/components/common/MultiSelect/MultiSelect.styled.ts @@ -8,7 +8,6 @@ const MultiSelect = styled(ReactMultiSelect)<{ minWidth?: string }>` & > .dropdown-container { height: 32px; - * { cursor: ${({ disabled }) => (disabled ? 'not-allowed' : 'pointer')}; } @@ -17,6 +16,14 @@ const MultiSelect = styled(ReactMultiSelect)<{ minWidth?: string }>` height: 32px; color: ${({ disabled, theme }) => disabled ? theme.select.color.disabled : theme.select.color.active}; + & > .clear-selected-button { + display: none; + } + &:hover { + & > .clear-selected-button { + display: block; + } + } } } `;
null
train
train
2022-05-27T12:29:54
"2022-05-23T10:38:28Z"
armenuikafka
train
provectus/kafka-ui/2005_2036
provectus/kafka-ui
provectus/kafka-ui/2005
provectus/kafka-ui/2036
[ "timestamp(timedelta=0.0, similarity=0.8575432137963457)", "connected" ]
93a09a6327635078772386100bf955213d29bbb9
4f1078aabbb3cdbe22770f4b1d98df0420e16562
[ "Hello there armenuikafka! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€" ]
[]
"2022-05-26T08:55:46Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Not functional 'Next' button within Messages/Topic
**Describe the bug** 'Next' button is not working for Messages within Topic **Steps to Reproduce** Steps to reproduce the behavior: 1. Open the Topic which has Messages 2. Navigate to 'Messages' tab 3. Press 'Next' button **Expected behavior** 'Next' button should not exist as it's not functional for Messages. **Screenshots** https://user-images.githubusercontent.com/104780608/169489025-f90118d6-e715-49bc-a29a-47a61c475798.mov
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx", "kafka-ui-react-app/src/components/Version/Version.tsx", "kafka-ui-react-app/src/lib/constants.ts" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx", "kafka-ui-react-app/src/components/Version/Version.tsx", "kafka-ui-react-app/src/lib/constants.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx index 6c2e69aa0fa..83192b3f908 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessagesTable.tsx @@ -1,12 +1,9 @@ import PageLoader from 'components/common/PageLoader/PageLoader'; import { Table } from 'components/common/table/Table/Table.styled'; import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell'; -import { SeekDirection, TopicMessage } from 'generated-sources'; -import styled from 'styled-components'; -import { compact, concat, groupBy, map, maxBy, minBy } from 'lodash'; +import { TopicMessage } from 'generated-sources'; import React, { useContext } from 'react'; import { useSelector } from 'react-redux'; -import { useHistory } from 'react-router-dom'; import { getTopicMessges, getIsTopicMessagesFetching, @@ -14,110 +11,52 @@ import { import TopicMessagesContext from 'components/contexts/TopicMessagesContext'; import Message from './Message'; -import * as S from './MessageContent/MessageContent.styled'; - -const MessagesPaginationWrapperStyled = styled.div` - padding: 16px; - display: flex; - justify-content: flex-start; -`; const MessagesTable: React.FC = () => { - const history = useHistory(); - - const { searchParams, isLive } = useContext(TopicMessagesContext); + const { isLive } = useContext(TopicMessagesContext); const messages = useSelector(getTopicMessges); const isFetching = useSelector(getIsTopicMessagesFetching); - const handleNextClick = () => { - const seekTo = searchParams.get('seekTo'); - - if (seekTo) { - const selectedPartitions = seekTo.split(',').map((item) => { - const [partition] = item.split('::'); - return { offset: 0, partition: parseInt(partition, 10) }; - }); - - const seekDirection = searchParams.get('seekDirection'); - const isBackward = seekDirection === SeekDirection.BACKWARD; - - const messageUniqs = map(groupBy(messages, 'partition'), (v) => - isBackward ? minBy(v, 'offset') : maxBy(v, 'offset') - ).map((message) => ({ - offset: message?.offset || 0, - partition: message?.partition || 0, - })); - - const nextSeekTo = compact( - map( - groupBy(concat(selectedPartitions, messageUniqs), 'partition'), - (v) => maxBy(v, 'offset') - ) - ) - .map(({ offset, partition }) => { - const offsetQuery = isBackward ? offset : offset + 1; - - return `${partition}::${offsetQuery}`; - }) - .join(','); - - searchParams.set('seekTo', nextSeekTo); - - history.push({ - search: `?${searchParams.toString()}`, - }); - } - }; - return ( - <> - <Table isFullwidth> - <thead> + <Table isFullwidth> + <thead> + <tr> + <TableHeaderCell> </TableHeaderCell> + <TableHeaderCell title="Offset" /> + <TableHeaderCell title="Partition" /> + <TableHeaderCell title="Timestamp" /> + <TableHeaderCell title="Key" /> + <TableHeaderCell title="Content" /> + <TableHeaderCell> </TableHeaderCell> + </tr> + </thead> + <tbody> + {messages.map((message: TopicMessage) => ( + <Message + key={[ + message.offset, + message.timestamp, + message.key, + message.partition, + ].join('-')} + message={message} + /> + ))} + {isFetching && isLive && !messages.length && ( + <tr> + <td colSpan={10}> + <PageLoader /> + </td> + </tr> + )} + {messages.length === 0 && !isFetching && ( <tr> - <TableHeaderCell> </TableHeaderCell> - <TableHeaderCell title="Offset" /> - <TableHeaderCell title="Partition" /> - <TableHeaderCell title="Timestamp" /> - <TableHeaderCell title="Key" /> - <TableHeaderCell title="Content" /> - <TableHeaderCell> </TableHeaderCell> + <td colSpan={10}>No messages found</td> </tr> - </thead> - <tbody> - {messages.map((message: TopicMessage) => ( - <Message - key={[ - message.offset, - message.timestamp, - message.key, - message.partition, - ].join('-')} - message={message} - /> - ))} - {isFetching && isLive && !messages.length && ( - <tr> - <td colSpan={10}> - <PageLoader /> - </td> - </tr> - )} - {messages.length === 0 && !isFetching && ( - <tr> - <td colSpan={10}>No messages found</td> - </tr> - )} - </tbody> - </Table> - {!isLive && ( - <MessagesPaginationWrapperStyled> - <S.PaginationButton onClick={handleNextClick}> - Next - </S.PaginationButton> - </MessagesPaginationWrapperStyled> - )} - </> + )} + </tbody> + </Table> ); }; diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx index 351a55f5f03..e3b09f5b182 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/MessagesTable.spec.tsx @@ -5,7 +5,6 @@ import MessagesTable from 'components/Topics/Topic/Details/Messages/MessagesTabl import { Router } from 'react-router-dom'; import { createMemoryHistory, MemoryHistory } from 'history'; import { SeekDirection, SeekType, TopicMessage } from 'generated-sources'; -import userEvent from '@testing-library/user-event'; import TopicMessagesContext, { ContextProps, } from 'components/contexts/TopicMessagesContext'; @@ -71,13 +70,6 @@ describe('MessagesTable', () => { it('should check the if no elements is rendered in the table', () => { expect(screen.getByText(/No messages found/i)).toBeInTheDocument(); }); - - it('should check if next button exist and check the click after next click', () => { - const nextBtnElement = screen.getByText(/next/i); - expect(nextBtnElement).toBeInTheDocument(); - userEvent.click(nextBtnElement); - expect(screen.getByText(/No messages found/i)).toBeInTheDocument(); - }); }); describe('Custom Setup with different props value', () => { @@ -90,44 +82,6 @@ describe('MessagesTable', () => { setUpComponent(searchParams, { ...contextValue, isLive: true }, [], true); expect(screen.getByRole('progressbar')).toBeInTheDocument(); }); - - it('should check the seekTo parameter in the url if no seekTo is found should noy change the history', () => { - const customSearchParam = new URLSearchParams(searchParamsValue); - - const mockedHistory = createMemoryHistory({ - initialEntries: [customSearchParam.toString()], - }); - jest.spyOn(mockedHistory, 'push'); - - setUpComponent(customSearchParam, contextValue, [], false, mockedHistory); - - userEvent.click(screen.getByRole('button', { name: 'Next' })); - expect(mockedHistory.push).toHaveBeenCalledWith({ - search: searchParamsValue.replace(seekToResult, '&seekTo=0%3A%3A1'), - }); - }); - - it('should check the seekTo parameter in the url if no seekTo is found should change the history', () => { - const customSearchParam = new URLSearchParams( - searchParamsValue.replace(seekToResult, '') - ); - - const mockedHistory = createMemoryHistory({ - initialEntries: [customSearchParam.toString()], - }); - jest.spyOn(mockedHistory, 'push'); - - setUpComponent( - customSearchParam, - { ...contextValue, searchParams: customSearchParam }, - [], - false, - mockedHistory - ); - - userEvent.click(screen.getByRole('button', { name: 'Next' })); - expect(mockedHistory.push).not.toHaveBeenCalled(); - }); }); describe('should render Messages table with data', () => { diff --git a/kafka-ui-react-app/src/components/Version/Version.tsx b/kafka-ui-react-app/src/components/Version/Version.tsx index 39ecc5227e2..ecda8f4a6ee 100644 --- a/kafka-ui-react-app/src/components/Version/Version.tsx +++ b/kafka-ui-react-app/src/components/Version/Version.tsx @@ -29,7 +29,6 @@ const Version: React.FC<VesionProps> = ({ tag, commit }) => { }, [tag]); const { outdated, latestTag } = latestVersionInfo; - return ( <S.Wrapper> <S.CurrentVersion>{tag}</S.CurrentVersion> diff --git a/kafka-ui-react-app/src/lib/constants.ts b/kafka-ui-react-app/src/lib/constants.ts index 1366db367a2..ba70329e7bb 100644 --- a/kafka-ui-react-app/src/lib/constants.ts +++ b/kafka-ui-react-app/src/lib/constants.ts @@ -16,7 +16,7 @@ export const BASE_PARAMS: ConfigurationParameters = { }; export const TOPIC_NAME_VALIDATION_PATTERN = /^[.,A-Za-z0-9_-]+$/; -export const SCHEMA_NAME_VALIDATION_PATTERN = /^[.,A-Za-z0-9_-]+$/; +export const SCHEMA_NAME_VALIDATION_PATTERN = /^[.,A-Za-z0-9_/-]+$/; export const TOPIC_CUSTOM_PARAMS_PREFIX = 'customParams'; export const TOPIC_CUSTOM_PARAMS: Record<string, string> = {
null
train
train
2022-05-30T11:45:21
"2022-05-23T04:26:52Z"
armenuikafka
train
provectus/kafka-ui/2023_2038
provectus/kafka-ui
provectus/kafka-ui/2023
provectus/kafka-ui/2038
[ "connected" ]
996e127a0256fe751c423732bb339e7149673afb
7211a18b5770e18087ccbf159699860b31f700e8
[]
[ "isn't return; same as return undefined ?", "not in this case, we need to write return value otherwise it throws error like this in catch block that 'Async arrow function expected no return value.'" ]
"2022-05-26T11:45:53Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
In case of editing Partition number of Topic update is available only after refreshing
**Describe the bug** The partition count update is not visible immediately after editing; it shows up only after refreshing Topics **Steps to Reproduce** Steps to reproduce the behavior: 1. Login to System 2. Navigate to Topics 3. Open the Topic 4. Press 'Edit Settings' 5. Change the 'Number of partitions *' 6. Press 'Submit' 7. Turn back to Topic 8. Open the Settings to check the count of Partitions **Expected behavior** - A success message about the update should be displayed - The update should be visible without refreshing the Topics **Screenshots** https://user-images.githubusercontent.com/104780608/170209809-f89ab523-e760-4174-982c-91778aa4ae94.mov
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts index 997b88e0f2e..c81eb696e18 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts @@ -29,6 +29,10 @@ import { consumerGroupPayload } from 'redux/reducers/consumerGroups/__test__/fix import fetchMock from 'fetch-mock-jest'; import mockStoreCreator from 'redux/store/configureStore/mockStoreCreator'; import { getTypeAndPayload } from 'lib/testHelpers'; +import { + alertAdded, + showSuccessAlert, +} from 'redux/reducers/alerts/alertsSlice'; const topic = { name: 'topic', @@ -658,6 +662,17 @@ describe('topics Slice', () => { }); }); describe('updateTopicPartitionsCount', () => { + const RealDate = Date.now; + + beforeAll(() => { + global.Date.now = jest.fn(() => + new Date('2019-04-07T10:20:30Z').getTime() + ); + }); + + afterAll(() => { + global.Date.now = RealDate; + }); it('updateTopicPartitionsCount/fulfilled', async () => { fetchMock.patchOnce( `/api/clusters/${clusterName}/topics/${topicName}/partitions`, 200 ); await store.dispatch( updateTopicPartitionsCount({ clusterName, topicName, partitions: 1, }) ); - expect(getTypeAndPayload(store)).toEqual([ { type: updateTopicPartitionsCount.pending.type }, + { type: showSuccessAlert.pending.type }, + { + type: alertAdded.type, + payload: { + id: 'message-topic-local-1', + title: '', + type: 'success', + createdAt: global.Date.now(), + message: 'Number of partitions successfully increased!', + }, + }, + { type: fetchTopicDetails.pending.type }, + { type: showSuccessAlert.fulfilled.type }, { type: updateTopicPartitionsCount.fulfilled.type, }, diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index 9aac1079b42..3a35cc3d0ee 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -33,6 +33,7 @@ import { import { BASE_PARAMS } from 'lib/constants'; import { getResponse } from 'lib/errorHandling'; import { clearTopicMessages } from 'redux/reducers/topicMessages/topicMessagesSlice'; +import { showSuccessAlert } from 'redux/reducers/alerts/alertsSlice'; const apiClientConf = new Configuration(BASE_PARAMS); const topicsApiClient = new TopicsApi(apiClientConf); @@ -243,21 +244,30 @@ export const updateTopicPartitionsCount = createAsyncThunk< topicName: TopicName; partitions: number; } ->('topic/updateTopicPartitionsCount', async (payload, { rejectWithValue }) => { - try { - const { clusterName, topicName, partitions } = payload; - - await topicsApiClient.increaseTopicPartitions({ - clusterName, - topicName, - partitionsIncrease: { totalPartitionsCount: partitions }, - }); +>( + 'topic/updateTopicPartitionsCount', + async (payload, { rejectWithValue, dispatch }) => { + try { + const { clusterName, topicName, partitions } = payload; - return undefined; - } catch (err) { - return rejectWithValue(await getResponse(err as Response)); + await topicsApiClient.increaseTopicPartitions({ + clusterName, + topicName, + partitionsIncrease: { totalPartitionsCount: partitions }, + }); + dispatch( + showSuccessAlert({ + id: `message-${topicName}-${clusterName}-${partitions}`, + message: 'Number of partitions successfully increased!', + }) + ); + dispatch(fetchTopicDetails({ clusterName, topicName })); + return undefined; + } catch (err) { + return rejectWithValue(await getResponse(err as Response)); + } } -}); +); export const updateTopicReplicationFactor = createAsyncThunk< undefined,
null
test
train
2022-05-30T12:24:59
"2022-05-25T07:53:23Z"
armenuikafka
train
provectus/kafka-ui/2037_2041
provectus/kafka-ui
provectus/kafka-ui/2037
provectus/kafka-ui/2041
[ "timestamp(timedelta=0.0, similarity=0.8520623807568288)", "connected" ]
0c881b2df52e116669f68f66d4435055685008e2
6b1b47f02ba8b497b74db8dbdb6c9cebd101933e
[ "I'll take this one.", "@doomcrewinc hey, thanks!" ]
[]
"2022-05-27T03:14:18Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Rename connects to connectors
**Describe the bug** 'Connectors' should be displayed instead of 'Connects' within Kafka Connect **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Kafka Connect **Expected behavior** Connectors, Failed Connectors, Failed Tasks should be displayed **Screenshots** <img width="1717" alt="connects vc connectors" src="https://user-images.githubusercontent.com/104780608/170475292-58c1849e-cead-4d42-9ba9-4de19dcc77ab.png">
[ "kafka-ui-react-app/src/components/Connect/List/List.tsx" ]
[ "kafka-ui-react-app/src/components/Connect/List/List.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/List/List.tsx b/kafka-ui-react-app/src/components/Connect/List/List.tsx index 0f1ab074b0c..d7e911c1901 100644 --- a/kafka-ui-react-app/src/components/Connect/List/List.tsx +++ b/kafka-ui-react-app/src/components/Connect/List/List.tsx @@ -70,8 +70,8 @@ const List: React.FC<ListProps> = ({ <Metrics.Wrapper> <Metrics.Section> <Metrics.Indicator - label="Connects" - title="Connects" + label="Connectors" + title="Connectors" fetching={areConnectsFetching} > {connectors.length}
null
train
train
2022-05-25T19:00:21
"2022-05-26T11:00:49Z"
armenuikafka
train
provectus/kafka-ui/2007_2042
provectus/kafka-ui
provectus/kafka-ui/2007
provectus/kafka-ui/2042
[ "connected" ]
4f1078aabbb3cdbe22770f4b1d98df0420e16562
996e127a0256fe751c423732bb339e7149673afb
[]
[]
"2022-05-27T08:51:10Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Partition value displays not right in case of selecting second time within Topic/Messages
**Describe the bug** The selected value for the Partition dropdown should always be displayed in the same format for Messages filtering **Steps to Reproduce** Steps to reproduce the behavior: 1. Login to system 2. Navigate to Topic profile which has Messages and more than one partition 3. Turn to Messages tab 4. Check filtering by partitions 5. Select/unselect the value for partitions dropdown **Expected behavior** The value should always be displayed the same way (partition #) **Screenshots** https://user-images.githubusercontent.com/104780608/169744114-071908a9-07e3-41fb-8d9b-f85ae738aad2.mov
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts", "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx index 9eb7884b761..ad3aec6b815 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx @@ -181,7 +181,7 @@ const Filters: React.FC<FiltersProps> = ({ partitions.map((partition: Partition) => { return { value: partition.partition, - label: String(partition.partition), + label: `Partition #${partition.partition.toString()}`, }; }) ); diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts index 8b29b5b820f..16dbc7db693 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/utils.ts @@ -53,7 +53,7 @@ export const getSelectedPartitionsFromSeekToParam = ( if (selectedPartitionIds?.includes(partition)) { return { value: partition, - label: partition.toString(), + label: `Partition #${partition.toString()}`, }; } @@ -64,6 +64,6 @@ export const getSelectedPartitionsFromSeekToParam = ( return partitions.map(({ partition }) => ({ value: partition, - label: partition.toString(), + label: `Partition #${partition.toString()}`, })); }; diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts index c58bc09b0ea..06a13d641b1 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/__test__/utils.spec.ts @@ -101,7 +101,7 @@ describe('utils', () => { it('returns parsed partition from params when partition list includes selected partition', () => { searchParams.set('seekTo', '42::0'); expect(getSelectedPartitionsFromSeekToParam(searchParams, part)).toEqual([ - { label: '42', value: 42 }, + { label: 'Partition #42', value: 42 }, ]); }); it('returns parsed partition from params when partition list NOT includes selected partition', () => { @@ -113,7 +113,7 @@ describe('utils', () => { it('returns partitions when param "seekTo" is not defined', () => { searchParams.delete('seekTo'); expect(getSelectedPartitionsFromSeekToParam(searchParams, part)).toEqual([ - { label: '42', value: 42 }, + { label: 'Partition #42', value: 42 }, ]); }); });
null
train
train
2022-05-30T12:15:27
"2022-05-23T04:41:02Z"
armenuikafka
train
provectus/kafka-ui/2000_2044
provectus/kafka-ui
provectus/kafka-ui/2000
provectus/kafka-ui/2044
[ "connected" ]
6b1b47f02ba8b497b74db8dbdb6c9cebd101933e
8393232a8301277fd397b5910134dc2a754fda52
[]
[]
"2022-05-27T10:13:36Z"
[ "scope/backend", "scope/frontend", "status/accepted", "type/chore" ]
UI: Double "v" in version name
Please remove the extra "v"; we already have one in the version name itself. <img width="310" alt="image" src="https://user-images.githubusercontent.com/1494347/169289884-421b2f60-018a-4ebf-a6ff-a30f8fdc7447.png">
[ "kafka-ui-api/pom.xml", "kafka-ui-contract/pom.xml", "pom.xml" ]
[ "kafka-ui-api/pom.xml", "kafka-ui-contract/pom.xml", "pom.xml" ]
[]
diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index e2b6ceddb98..9665aad565f 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -396,7 +396,7 @@ <configuration> <workingDirectory>../kafka-ui-react-app</workingDirectory> <environmentVariables> - <REACT_APP_TAG>v${project.version}</REACT_APP_TAG> + <REACT_APP_TAG>${project.version}</REACT_APP_TAG> <REACT_APP_COMMIT>${git.commit.id.abbrev}</REACT_APP_COMMIT> </environmentVariables> </configuration> diff --git a/kafka-ui-contract/pom.xml b/kafka-ui-contract/pom.xml index 1bbe2d8a275..581999c2953 100644 --- a/kafka-ui-contract/pom.xml +++ b/kafka-ui-contract/pom.xml @@ -138,7 +138,7 @@ <configuration> <workingDirectory>../kafka-ui-react-app</workingDirectory> <environmentVariables> - <REACT_APP_TAG>v${project.version}</REACT_APP_TAG> + <REACT_APP_TAG>${project.version}</REACT_APP_TAG> </environmentVariables> </configuration> <executions> diff --git a/pom.xml b/pom.xml index 185ea252e09..e7c5898b15f 100644 --- a/pom.xml +++ b/pom.xml @@ -86,5 +86,5 @@ <artifactId>kafka-ui</artifactId> <version>0.0.1-SNAPSHOT</version> <name>kafka-ui</name> - <description>Kafka metrics for UI panel</description> + <description>Web UI for Apache Kafka</description> </project>
null
train
train
2022-05-27T10:59:52
"2022-05-19T12:11:11Z"
Haarolean
train
provectus/kafka-ui/2031_2047
provectus/kafka-ui
provectus/kafka-ui/2031
provectus/kafka-ui/2047
[ "timestamp(timedelta=0.0, similarity=1.0000000000000002)", "connected" ]
7ba10c1b7a519d40f87c0970ad3b0ca5a8973d46
2a51f0ee14ec09e1e6323a6baf5cd16175ab5694
[]
[]
"2022-05-27T11:20:07Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Error message about Value field requiredness is shown once the Custom Parameter added within Create a new Topic form
**Describe the bug** Once a Custom Parameter is added, the error message for the Value field is shown immediately; it would be better to display it only after the field is touched and left empty (to work like the Topic Name field) **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Press Add a Topic 3. Click on 'Add Custom Parameter' **Expected behavior** - Error messages for required fields should be displayed only after they are left empty - Required fields should have a "*" asterisk symbol to be clear to the user **Screenshots** <img width="1717" alt="required fileds" src="https://user-images.githubusercontent.com/104780608/170314790-617be184-480a-40bf-95de-05b9ceef96cf.png"> **Additional context** _**Please make sure all the required fields have an asterisk * symbol in the form**_ linked to https://github.com/provectus/kafka-ui/issues/1885 https://github.com/provectus/kafka-ui/issues/2030
[ "kafka-ui-react-app/src/components/Topics/New/New.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[ "kafka-ui-react-app/src/components/Topics/New/New.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/New/New.tsx b/kafka-ui-react-app/src/components/Topics/New/New.tsx index 98e9aca0d99..cbb15f0141b 100644 --- a/kafka-ui-react-app/src/components/Topics/New/New.tsx +++ b/kafka-ui-react-app/src/components/Topics/New/New.tsx @@ -24,7 +24,7 @@ enum Filters { const New: React.FC = () => { const methods = useForm<TopicFormData>({ - mode: 'all', + mode: 'onChange', resolver: yupResolver(topicFormValidationSchema), }); diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx index d56bd50d54e..72f28c5c461 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx @@ -59,7 +59,7 @@ const CustomParamField: React.FC<Props> = ({ newExistingFields.push(nameValue); setExistingFields(newExistingFields); setValue(`customParams.${index}.value`, TOPIC_CUSTOM_PARAMS[nameValue], { - shouldValidate: true, + shouldValidate: !!TOPIC_CUSTOM_PARAMS[nameValue], }); } }, [existingFields, index, nameValue, setExistingFields, setValue]); @@ -92,7 +92,7 @@ const CustomParamField: React.FC<Props> = ({ </FormError> </div> <div> - <InputLabel>Value</InputLabel> + <InputLabel>Value *</InputLabel> <Input name={`customParams.${index}.value` as const} hookFormOptions={{ diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index c8af702f0f3..f93b15de442 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -61,7 +61,7 @@ export const topicFormValidationSchema = yup.object().shape({ maxMessageBytes: yup.number().min(1).required(), customParams: yup.array().of( yup.object().shape({ - name: yup.string().required(), + name: yup.string().required('Custom parameter is required'), value: yup.string().required('Value is required'), }) ),
null
train
train
2022-05-30T16:36:04
"2022-05-25T16:36:10Z"
armenuikafka
train
provectus/kafka-ui/2030_2049
provectus/kafka-ui
provectus/kafka-ui/2030
provectus/kafka-ui/2049
[ "timestamp(timedelta=0.0, similarity=1.0000000000000002)", "connected" ]
71ac16357b9f2429f0f65826865c86debb8640a8
50afb26f95c80904e31a321d4572e8380b772354
[]
[]
"2022-05-27T12:20:57Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Fix the error messages about fields requiredness within Create a new Topic form
**Describe the bug** Please update the error messages for required fields to be clearer and simpler **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Press Add a Topic 3. Press 'Submit' button **Expected behavior** Error messages about field requiredness should be understandable to the user **Screenshots** <img width="1717" alt="Create a topic-error messages" src="https://user-images.githubusercontent.com/104780608/170308942-29ee1640-d9fd-4d58-aa36-f354435db81a.png"> **Additional context** Linked to https://github.com/provectus/kafka-ui/issues/1885
[ "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[ "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[]
diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index f93b15de442..db185821095 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -52,13 +52,32 @@ export const topicFormValidationSchema = yup.object().shape({ TOPIC_NAME_VALIDATION_PATTERN, 'Only alphanumeric, _, -, and . allowed' ), - partitions: yup.number().min(1).required(), - replicationFactor: yup.number().min(1).required(), - minInsyncReplicas: yup.number().min(1).required(), + partitions: yup + .number() + .min(1) + .required() + .typeError('Number of partitions is required and must be a number'), + replicationFactor: yup + .number() + .min(1) + .required() + .typeError('Replication factor is required and must be a number'), + minInsyncReplicas: yup + .number() + .min(1) + .required() + .typeError('Min in sync replicas is required and must be a number'), cleanupPolicy: yup.string().required(), - retentionMs: yup.number().min(-1, 'Must be greater than or equal to -1'), + retentionMs: yup + .number() + .min(-1, 'Must be greater than or equal to -1') + .typeError('Time to retain data is required and must be a number'), retentionBytes: yup.number(), - maxMessageBytes: yup.number().min(1).required(), + maxMessageBytes: yup + .number() + .min(1) + .required() + .typeError('Maximum message size is required and must be a number'), customParams: yup.array().of( yup.object().shape({ name: yup.string().required('Custom parameter is required'),
null
train
train
2022-05-31T11:33:15
"2022-05-25T16:15:50Z"
armenuikafka
train
provectus/kafka-ui/1466_2056
provectus/kafka-ui
provectus/kafka-ui/1466
provectus/kafka-ui/2056
[ "timestamp(timedelta=1.0, similarity=0.9245798004130378)", "connected" ]
119c7d0107e02e735dcef9f15ba911a272dc592f
541e4018ec251ea371fd64b198b916b9fd198d99
[ "Hello there madrisan! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Thanks, we can actually schedule it for 0.4, I'll let you know if we'll need to test it out", "Ok. Thanks! I'll be glad to help testing this feature.", "I add a feature request about it: our Kafka cluster uses Active Directory for authn/authz and the user should be able to do what its user is granted to. The kafka-ui should use the provided credential to connect to the cluster using SASL_SSL.", "@angeloxx thanks for the suggestion, but that's totally another feature: #753 ", "The #753 plans to manage RBAC on the tool, this request is to pass-thru authentication and use providede credential to access to the cluster.", "@angeloxx ah, got you. Please raise a new issue", "Please try [this test build](https://s3.haarolean.dev/public/kafka-ui/issues-1466/kafka-ui_AD.tar.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=6RDSCSYBXHZ9QERQC9S5%2F20220602%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220602T090247Z&X-Amz-Expires=604800&X-Amz-Security-Token=eyJhbGciOiJIUzUxMiIsInR5cCI6IkpXVCJ9.eyJhY2Nlc3NLZXkiOiI2UkRTQ1NZQlhIWjlRRVJRQzlTNSIsImV4cCI6MTY1NDE2NDE0NywicGFyZW50IjoiaGFhcm9sZWFuIn0.ffmVfc0n0eJE7iJi5_SGMzFt4f7Z_dp_E1OzSrb5SsrBeKuyW0NCmiqbxfyKdJh5TqZCBAqc2QRdaGlpljlICg&X-Amz-SignedHeaders=host&versionId=null&X-Amz-Signature=b63d3f328ed6ba58284342b8000978824379efecf71602b4548af807db124480):\r\n\r\nImport it via `docker load < file.tar.gz`.\r\nImage name is `provectuslabs/kafka-ui:ldap_ad`.\r\nAdditionally to other LDAP settings add these:\r\n`OAUTH2_LDAP_AD=true`\r\n`OAUTH2_LDAP_AD_DOMAIN=google.com`\r\nThe link is available for 7 days. Let me know how it goes.", "I'll try to test it today, thanks.", "@madrisan updated the link. It'll be active for 7 days.", "Sorry. 
I'm unable to bypass the image arch issue\r\n```\r\n$ docker inspect --format='{{.Os}}/{{.Architecture}}/{{.Variant}}' provectuslabs/kafka-ui:ldap_ad\r\nlinux/arm64/v8\r\n```\r\nthat ends up in a runtime error (even when running with `--platform \"linux/amd64\"`)\r\n```\r\nimage was found but does not match the specified platform: wanted linux/amd64, actual: linux/arm64/v8\r\n```\r\nI've tried with `docker`, `docker` + `export DOCKER_BUILDKIT=1` and `podman`.", "Oh well, let's try [this one](https://s3.haarolean.dev/public/kafka-ui/issues-1466/kafka-ui_AD.tar.gz?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=702E1PC51NRF95FEIVWD%2F20220602%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220602T115406Z&X-Amz-Expires=604800&X-Amz-Security-Token=eyJhbGciOiJIUzUxMiIsInR5cCI6IkpXVCJ9.eyJhY2Nlc3NLZXkiOiI3MDJFMVBDNTFOUkY5NUZFSVZXRCIsImV4cCI6MTY1NDE3NDM3NiwicGFyZW50IjoiaGFhcm9sZWFuIn0.p8weQrU0NsZlTmHHSQdloLmdoDMHK_0eRmFcR2ZfhCbJbJ0IOwjq2mLQY5842as28AfOGheM0y9ssW5YrWsa6A&X-Amz-SignedHeaders=host&versionId=null&X-Amz-Signature=4d54eb9ed6ac18841024166a8157bd9b17092e6f04ae8e2cf9529728b383554b), I've built it for amd64 this time.", "Thanks.\r\nI got an UNAUTHORIZED login error:\r\n```\r\n2022-06-02 16:19:01,029 DEBUG [boundedElastic-1] o.s.s.w.s.a.AuthenticationWebFilter: Authentication failed: simple bind failed: 10.100.16.38:636; nested exception is javax.naming.CommunicationException: simple bind failed: 10.10.1.16:636 [Root exception is java.net.SocketException: Connection or outbound has closed]\r\n2022-06-02 16:19:01,030 DEBUG [boundedElastic-1] o.s.s.w.s.DelegatingServerAuthenticationEntryPoint: Trying to match using org.springframework.security.config.web.server.ServerHttpSecurity$HttpBasicSpec$$Lambda$700/0x0000000801085840@11a68372\r\n2022-06-02 16:19:01,030 DEBUG [boundedElastic-1] o.s.s.w.s.DelegatingServerAuthenticationEntryPoint: No match found. Using default entry point org.springframework.security.web.server.authentication.HttpBasicServerAuthenticationEntryPoint@2ab53c6a\r\n2022-06-02 16:19:01,030 DEBUG [boundedElastic-1] o.s.w.s.a.HttpWebHandlerAdapter: [382d1f76-2] Completed 401 UNAUTHORIZED\r\n```\r\nI need to investigate more bit I'll be off for some days.", "Feels like it might be due to SSL. Do you use LDAPS by any chance?\r\nIf that's the case, let's try this:\r\nhttps://github.com/provectus/kafka-ui/blob/master/documentation/compose/kafka-ui-jmx-secured.yml#L27\r\nPass the truststore param, do a volume bind and add your LDAP's server SSL certificate into the keystore. Let me know how it goes.\r\n", "Yes, we use LDAPS and I thought also of a probable certificate issue. It's a recurrent issue. Thanks for the documentation! I'll try to test this setup next week.", "I was able to import our SSL certificate into the Java keystore but now I got all the time a login authentication error.\r\nThe cause is likely to be an incorrect configuration on my part.\r\nUnfortunately I don't have access to the AD logs so it's very difficult to get the root cause and fix it: p", "> I was able to import our SSL certificate into the Java keystore but now I got all the time a login authentication error. The cause is likely to be an incorrect configuration on my part. Unfortunately I don't have access to the AD logs so it's very difficult to get the root cause and fix it: p\r\n\r\nguess it's safe to merge then? :)", "I think so. 
This project has not reached the 1.0 version so one can expect some minor issues, and having this feature available will permit other people to test it (or use it if it work just fine). IMHO.", "Hi @madrisan I faced issue while authentication with LDAPS. I already added certificate of ldaps to keystore. But I still couldn't login. Do you know what I were missing?\r\nMy docker-compose.yml\r\n```\r\nversion: '3.4'\r\nservices:\r\n kafka-ui:\r\n container_name: kafka-ui\r\n image: provectuslabs/kafka-ui:v0.7.0\r\n ports:\r\n - 8080:8080\r\n environment:\r\n LOGGING_LEVEL_ROOT: DEBUG\r\n AUTH_TYPE: \"LDAP\"\r\n SPRING_LDAP_URLS: \"ldaps://ldap001.as1.domain.net:636\"\r\n SPRING_LDAP_USERFILTER_SEARCHBASE: \"OU=Users,OU=Evidence.com,DC=as1,DC=domain,DC=net\"\r\n SPRING_LDAP_ADMINUSER: \"CN=ldap_user,OU=Service Accounts,OU=ABC.com,DC=as1,DC=domain,DC=net\"\r\n SPRING_LDAP_ADMINPASSWORD: \"xxxx\"\r\n SPRING_LDAP_USERFILTER_SEARCHFILTER: \"(&(uid={0})(objectClass=inetOrgPerson))\"\r\n SPRING_LDAP_DN_PATTERN: \"cn={0},OU=Users,OU=ABC.com,DC=as1,DC=domain,DC=net\"\r\n OAUTH2.LDAP.ACTIVEDIRECTORY: \"true\"\r\n OAUTH2.LDAP.AΠ‘TIVEDIRECTORY.DOMAIN: \"as1.domain.net\"\r\n KAFKA_CLUSTERS_0_NAME: \"kafka\"\r\n KAFKA_CLUSTERS_0_PROPERTIES_SECURITY_PROTOCOL: SSL\r\n KAFKA_CLUSTERS_0_PROPERTIES_SSL_KEYSTORE_LOCATION: \"/ssl/cacerts\"\r\n KAFKA_CLUSTERS_0_PROPERTIES_SSL_KEYSTORE_PASSWORD: \"pass\"\r\n KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: \"kafka01:9091\"\r\n KAFKA_CLUSTERS_0_SSL_TRUSTSTORELOCATION: \"/ssl/cacerts\"\r\n KAFKA_CLUSTERS_0_SSL_TRUSTSTOREPASSWORD: \"pass\"\r\n KAFKA_CLUSTERS_0_PROPERTIES_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: ''\r\n DYNAMIC_CONFIG_ENABLED: 'true'\r\n volumes:\r\n - path/compose/ssl:/ssl\r\n```\r\nError logs:\r\n```\r\nkafka-ui | 2023-05-12 04:44:26,153 DEBUG [reactor-http-epoll-8] o.s.w.s.a.HttpWebHandlerAdapter: [660a4958-10] HTTP POST \"/login\"\r\nkafka-ui | 2023-05-12 04:44:26,154 DEBUG [reactor-http-epoll-8] o.s.s.w.s.u.m.OrServerWebExchangeMatcher: Trying to match using PathMatcherServerWebExchangeMatcher{pattern='/login', method=POST}\r\nkafka-ui | 2023-05-12 04:44:26,155 DEBUG [reactor-http-epoll-8] o.s.s.w.s.u.m.PathPatternParserServerWebExchangeMatcher: Checking match of request : '/login'; against '/login'\r\nkafka-ui | 2023-05-12 04:44:26,155 DEBUG [reactor-http-epoll-8] o.s.s.w.s.u.m.OrServerWebExchangeMatcher: matched\r\nkafka-ui | 2023-05-12 04:44:26,155 DEBUG [reactor-http-epoll-8] r.n.c.FluxReceive: [660a4958-3, L:/172.19.0.2:8080 - R:/172.19.0.1:37802] [terminated=false, cancelled=false, pending=0, error=null]: subscribing inbound receiver\r\nkafka-ui | 2023-05-12 04:44:26,155 DEBUG [reactor-http-epoll-8] o.s.h.c.FormHttpMessageReader: [660a4958-10] Read form fields [username, password] (content masked)\r\nkafka-ui | 2023-05-12 04:44:26,762 DEBUG [boundedElastic-3] o.s.s.w.s.a.AuthenticationWebFilter: Authentication failed: Connection to LDAP server failed\r\nkafka-ui | 2023-05-12 04:44:26,763 DEBUG [boundedElastic-3] o.s.s.w.s.DefaultServerRedirectStrategy: Redirecting to '/login?error'\r\n```", "I just wanted to add my comments for anyone else who might be facing issues with setting up Ldaps while running Provectus UI in a container.\r\n\r\nWhile the above settings are right, I was still facing issues and it seemed that adding the ldaps certs (server and ca) to the truststore was not enough. 
Because I was getting an error:\r\n\r\n```\r\nDEBUG [reactor-http-epoll-2] o.s.h.c.FormHttpMessageReader: [41223826-4] Read form fields [username, password] (content masked)\r\nDEBUG [boundedElastic-1] o.s.s.w.s.a.AuthenticationWebFilter: Authentication failed: Connection to LDAP server failed\r\n\r\n```\r\n\r\nUncertain why this error occurs, i checked with tcpdump/Wireshark the ldaps port comms, and turned out to be a cert error:\r\n```\r\n7 0.013859 source.ip.address ldaps.ip.address TLSv1.2 63 Alert (Level: Fatal, Description: Certificate Unknown)\r\n```\r\n\r\nGot help regarding this on Discord (many thanks), and adding another parameter helped, ldaps worked - have to specify a JAVA_OPTS environment variable inside the container (so -e when spinning up a container), where I specified the truststore / pwd of the truststore containing the ldaps certs.\r\n\r\n```\r\n-e JAVA_OPTS='-Djavax.net.ssl.trustStore=/app/truststore.jks -Djavax.net.ssl.trustStorePassword=truststore-pwd' \r\n```", "You could do it with JAVA_OPTS environment variable also." ]
[ "I would use `activedeirectory` instead of `ac`", "pls use description naming for field like `activeDirectoryDomain`" ]
"2022-05-31T09:21:41Z"
[ "type/enhancement", "scope/backend", "status/accepted" ]
LDAP/LDAPS authentication with Microsoft Active Directory
### Is your proposal related to a problem? <!-- Provide a clear and concise description of what the problem is. For example, "I'm always frustrated when..." --> LDAP/LDAPS authentication with Microsoft Active Directory seems not yet supported by Kafka-UI. ### Describe the solution you'd like <!-- Provide a clear and concise description of what you want to happen. --> It would be nice if Kafka-UI also supported LDAP and LDAPS authentication with Microsoft Active Directory. According to the [Spring documentation](https://docs.spring.io/spring-security/site/docs/5.0.7.RELEASE/reference/html/ldap.html#ldap-active-directory) this is supported by the framework (though it may require one or more extra options). ### Describe alternatives you've considered <!-- Let us know about other solutions you've tried or researched. --> No alternatives available. ### Additional context <!-- Is there anything else you can add about the proposal? You might want to link to related issues here, if you haven't already. --> Our Kafka-UI service resides in a Kubernetes cluster and has been deployed by the official helm chart (v0.3.2). The configuration for LDAP we use is the following one (modulo server and domain names, which have been obfuscated): ``` kafka-ui: envs: config: AUTH_TYPE: "LDAP" SPRING_LDAP_URLS: "ldaps://ad01.domain.com:636" SPRING_LDAP_DN_PATTERN: "OU=people,DC=ad,DC=domain,DC=com" ``` The login window reappears in a loop when I enter the credentials and no messages are printed in the pod's logs. I think it would also be useful to see something logged in case of user logins in the Kafka-UI interface. This issue has been briefly discussed with @Haarolean on the Kafka-UI discord channel.
[ "documentation/compose/auth-ldap.yaml", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java" ]
[ "documentation/compose/auth-ldap.yaml", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java" ]
[]
diff --git a/documentation/compose/auth-ldap.yaml b/documentation/compose/auth-ldap.yaml index 7c25adce5dd..12069639f24 100644 --- a/documentation/compose/auth-ldap.yaml +++ b/documentation/compose/auth-ldap.yaml @@ -29,14 +29,19 @@ services: AUTH_TYPE: "LDAP" SPRING_LDAP_URLS: "ldap://ldap:10389" SPRING_LDAP_DN_PATTERN: "cn={0},ou=people,dc=planetexpress,dc=com" -# USER SEARCH FILTER INSTEAD OF DN + +# ===== USER SEARCH FILTER INSTEAD OF DN ===== + # SPRING_LDAP_USERFILTER_SEARCHBASE: "dc=planetexpress,dc=com" # SPRING_LDAP_USERFILTER_SEARCHFILTER: "(&(uid={0})(objectClass=inetOrgPerson))" # LDAP ADMIN USER # SPRING_LDAP_ADMINUSER: "cn=admin,dc=planetexpress,dc=com" # SPRING_LDAP_ADMINPASSWORD: "GoodNewsEveryone" +# ===== ACTIVE DIRECTORY ===== +# OAUTH2.LDAP.ACTIVEDIRECTORY: true +# OAUTH2.LDAP.AБTIVEDIRECTORY.DOMAIN: "memelord.lol" ldap: image: rroemhild/test-openldap:latest diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java index 0d629a88360..9681c36bc9f 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java @@ -14,8 +14,10 @@ import org.springframework.security.authentication.ReactiveAuthenticationManagerAdapter; import org.springframework.security.config.annotation.web.reactive.EnableWebFluxSecurity; import org.springframework.security.config.web.server.ServerHttpSecurity; +import org.springframework.security.ldap.authentication.AbstractLdapAuthenticationProvider; import org.springframework.security.ldap.authentication.BindAuthenticator; import org.springframework.security.ldap.authentication.LdapAuthenticationProvider; +import org.springframework.security.ldap.authentication.ad.ActiveDirectoryLdapAuthenticationProvider; import org.springframework.security.ldap.search.FilterBasedLdapUserSearch; import org.springframework.security.ldap.search.LdapUserSearch; import org.springframework.security.web.server.SecurityWebFilterChain; @@ -39,6 +41,11 @@ public class LdapSecurityConfig extends AbstractAuthSecurityConfig { @Value("${spring.ldap.userFilter.searchFilter:#{null}}") private String userFilterSearchFilter; + @Value("${oauth2.ldap.activeDirectory:false}") + private boolean isActiveDirectory; + @Value("${oauth2.ldap.aсtiveDirectory.domain:#{null}}") + private String activeDirectoryDomain; + @Bean public ReactiveAuthenticationManager authenticationManager(BaseLdapPathContextSource contextSource) { BindAuthenticator ba = new BindAuthenticator(contextSource); @@ -51,9 +58,15 @@ public ReactiveAuthenticationManager authenticationManager(BaseLdapPathContextSo ba.setUserSearch(userSearch); } - LdapAuthenticationProvider lap = new LdapAuthenticationProvider(ba); + AbstractLdapAuthenticationProvider authenticationProvider; + if (!isActiveDirectory) { + authenticationProvider = new LdapAuthenticationProvider(ba); + } else { + authenticationProvider = new ActiveDirectoryLdapAuthenticationProvider(activeDirectoryDomain, ldapUrls); + authenticationProvider.setUseAuthenticationRequestCredentials(true); + } - AuthenticationManager am = new ProviderManager(List.of(lap)); + AuthenticationManager am = new ProviderManager(List.of(authenticationProvider)); return new ReactiveAuthenticationManagerAdapter(am); } @@ -71,6 +84,9 @@ public BaseLdapPathContextSource contextSource() { @Bean public SecurityWebFilterChain configureLdap(ServerHttpSecurity http) { log.info("Configuring LDAP authentication."); + if (isActiveDirectory) { + log.info("Active Directory support for LDAP has been enabled."); + } http .authorizeExchange()
null
test
train
2022-07-14T11:38:55
"2022-01-24T09:45:32Z"
madrisan
train
provectus/kafka-ui/2015_2073
provectus/kafka-ui
provectus/kafka-ui/2015
provectus/kafka-ui/2073
[ "timestamp(timedelta=0.0, similarity=1.0)", "connected" ]
c0b3ca3bc604256848ba59afc21719632d37c367
70d3cee0bfc9a9beffa7f79263d82c61faeab378
[]
[]
"2022-05-31T12:16:58Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
UI improvement for 'Partition' dropdown within 'Produce Message' form
**Describe the bug** The Partition dropdown values are overlapping with the dropdown field within the Produce Message form **Steps to Reproduce** Steps to reproduce the behavior: 1. Login to system 2. Navigate to Topic/Messages 3. Press 'Produce Message' 4. Click the 'Partition' dropdown field **Expected behavior** The dropdown values should not overlap with the dropdown field **Screenshots** https://user-images.githubusercontent.com/104780608/169802466-819c36bb-0a3d-4207-898c-5bd517ff317e.mov
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx", "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx", "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx index 2c96fd69fff..567e094e8aa 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx @@ -22,11 +22,19 @@ import { getPartitionsByTopicName, getTopicMessageSchemaFetched, } from 'redux/reducers/topics/selectors'; +import Select, { SelectOption } from 'components/common/Select/Select'; import useAppParams from 'lib/hooks/useAppParams'; import validateMessage from './validateMessage'; import * as S from './SendMessage.styled'; +type FieldValues = Partial<{ + key: string; + content: string; + headers: string; + partition: number | string; +}>; + const SendMessage: React.FC = () => { const dispatch = useAppDispatch(); const { clusterName, topicName } = useAppParams<RouteParamsClusterTopic>(); @@ -46,6 +54,10 @@ const SendMessage: React.FC = () => { getPartitionsByTopicName(state, topicName) ); const schemaIsFetched = useAppSelector(getTopicMessageSchemaFetched); + const selectPartitionOptions: Array<SelectOption> = partitions.map((p) => { + const value = String(p.partition); + return { value, label: value }; + }); const keyDefaultValue = React.useMemo(() => { if (!schemaIsFetched || !messageSchema) { return undefined; } @@ -70,12 +82,11 @@ }, [messageSchema, schemaIsFetched]); const { - register, handleSubmit, formState: { isSubmitting, isDirty }, control, reset, - } = useForm({ + } = useForm<FieldValues>({ mode: 'onChange', defaultValues: { key: keyDefaultValue, @@ -156,24 +167,30 @@ <S.Wrapper> <form onSubmit={handleSubmit(onSubmit)}> <div className="columns"> - <div className="column is-one-third"> - <label className="label" htmlFor="select"> + <div> + <label + className="label" + id="selectPartitionOptions" + htmlFor="selectPartitionOptions" + > Partition </label> - <div className="select is-block"> - <select - id="select" - defaultValue={partitions[0].partition} - disabled={isSubmitting} - {...register('partition')} - > - {partitions.map((partition) => ( - <option key={partition.partition} value={partition.partition}> - {partition.partition} - </option> - ))} - </select> - </div> + <Controller + control={control} + name="partition" + defaultValue={selectPartitionOptions[0].value} + render={({ field: { name, onChange } }) => ( + <Select + id="selectPartitionOptions" + aria-labelledby="selectPartitionOptions" + name={name} + onChange={onChange} + minWidth="100%" + options={selectPartitionOptions} + value={selectPartitionOptions[0].value} + /> + )} + /> </div> </div> diff --git a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx index ba844c7eca2..dcc9e839dba 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/__test__/SendMessage.spec.tsx @@ -64,7 +64,12 @@ const renderAndSubmitData = async (error: string[] = []) => { await renderComponent(); expect(screen.queryByRole('progressbar')).not.toBeInTheDocument(); await act(() => { - userEvent.selectOptions(screen.getByLabelText('Partition'), '0'); + userEvent.click(screen.getByLabelText('Partition')); + }); + await act(() => { + userEvent.click(screen.getAllByRole('option')[1]); + }); + await act(() => { (validateMessage as Mock).mockImplementation(() => error); userEvent.click(screen.getByText('Send')); });
null
test
train
2022-06-01T14:40:46
"2022-05-23T10:44:58Z"
armenuikafka
train
provectus/kafka-ui/2062_2074
provectus/kafka-ui
provectus/kafka-ui/2062
provectus/kafka-ui/2074
[ "timestamp(timedelta=0.0, similarity=0.8420975608910389)", "connected" ]
a90aa52af54efd44222aee297e55ab40050d62cf
89c409a5d605dcb8dfc5d1452cd8b9ede0029c69
[]
[]
"2022-05-31T12:27:32Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Nothing happens upon creating a connector
1. Create a connector successfully 2. Nothing happens in the UI, no notifications/page redirections
[ "kafka-ui-react-app/src/components/Connect/New/New.tsx", "kafka-ui-react-app/src/components/Connect/New/NewContainer.ts", "kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Connect/New/New.tsx", "kafka-ui-react-app/src/components/Connect/New/NewContainer.ts", "kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/New/New.tsx b/kafka-ui-react-app/src/components/Connect/New/New.tsx index 64d307ce1b2..a7b307b283a 100644 --- a/kafka-ui-react-app/src/components/Connect/New/New.tsx +++ b/kafka-ui-react-app/src/components/Connect/New/New.tsx @@ -4,7 +4,7 @@ import useAppParams from 'lib/hooks/useAppParams'; import { Controller, FormProvider, useForm } from 'react-hook-form'; import { ErrorMessage } from '@hookform/error-message'; import { yupResolver } from '@hookform/resolvers/yup'; -import { Connect, Connector, NewConnector } from 'generated-sources'; +import { Connect } from 'generated-sources'; import { ClusterName, ConnectName } from 'redux/interfaces'; import { clusterConnectConnectorPath, ClusterNameRoute } from 'lib/paths'; import yup from 'lib/yupExtended'; @@ -16,6 +16,8 @@ import { FormError } from 'components/common/Input/Input.styled'; import Input from 'components/common/Input/Input'; import { Button } from 'components/common/Button/Button'; import PageHeading from 'components/common/PageHeading/PageHeading'; +import { createConnector } from 'redux/reducers/connect/connectSlice'; +import { useAppDispatch } from 'lib/hooks/redux'; import * as S from './New.styled'; @@ -28,11 +30,6 @@ export interface NewProps { fetchConnects(clusterName: ClusterName): unknown; areConnectsFetching: boolean; connects: Connect[]; - createConnector(payload: { - clusterName: ClusterName; - connectName: ConnectName; - newConnector: NewConnector; - }): Promise<{ connector: Connector | undefined }>; } interface FormValues { @@ -45,9 +42,9 @@ const New: React.FC<NewProps> = ({ fetchConnects, areConnectsFetching, connects, - createConnector, }) => { const { clusterName } = useAppParams<ClusterNameRoute>(); + const dispatch = useAppDispatch(); const navigate = useNavigate(); const methods = useForm<FormValues>({ @@ -83,15 +80,16 @@ const New: React.FC<NewProps> = ({ ); const onSubmit = async (values: FormValues) => { - const { connector } = await createConnector({ - clusterName, - connectName: values.connectName, - newConnector: { - name: values.name, - config: JSON.parse(values.config.trim()), - }, - }); - + const { connector } = await dispatch( + createConnector({ + clusterName, + connectName: values.connectName, + newConnector: { + name: values.name, + config: JSON.parse(values.config.trim()), + }, + }) + ).unwrap(); if (connector) { navigate( clusterConnectConnectorPath( diff --git a/kafka-ui-react-app/src/components/Connect/New/NewContainer.ts b/kafka-ui-react-app/src/components/Connect/New/NewContainer.ts index 23e577c4c12..d691043b54d 100644 --- a/kafka-ui-react-app/src/components/Connect/New/NewContainer.ts +++ b/kafka-ui-react-app/src/components/Connect/New/NewContainer.ts @@ -1,15 +1,12 @@ import { connect } from 'react-redux'; -import { - createConnector, - fetchConnects, -} from 'redux/reducers/connect/connectSlice'; +import { fetchConnects } from 'redux/reducers/connect/connectSlice'; import { RootState } from 'redux/interfaces'; import { getAreConnectsFetching, getConnects, } from 'redux/reducers/connect/selectors'; -import New, { NewProps } from './New'; +import New from './New'; const mapStateToProps = (state: RootState) => ({ areConnectsFetching: getAreConnectsFetching(state), @@ -18,7 +15,6 @@ const mapStateToProps = (state: RootState) => ({ const mapDispatchToProps = { fetchConnects, - createConnector: createConnector as unknown as NewProps['createConnector'], }; export default connect(mapStateToProps, mapDispatchToProps)(New); diff --git 
a/kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx b/kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx index 94cba3f5a24..d12b7af3d86 100644 --- a/kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx +++ b/kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx @@ -9,6 +9,7 @@ import { connects, connector } from 'redux/reducers/connect/__test__/fixtures'; import { fireEvent, screen, act } from '@testing-library/react'; import userEvent from '@testing-library/user-event'; import { ControllerRenderProps } from 'react-hook-form'; +import * as redux from 'react-redux'; jest.mock('components/common/PageLoader/PageLoader', () => 'mock-PageLoader'); jest.mock( @@ -53,7 +54,6 @@ describe('New', () => { fetchConnects={jest.fn()} areConnectsFetching={false} connects={connects} - createConnector={jest.fn()} {...props} /> </WithRoute>, @@ -70,30 +70,32 @@ describe('New', () => { }); it('calls createConnector on form submit', async () => { - const createConnector = jest.fn(); - renderComponent({ createConnector }); + const useDispatchSpy = jest.spyOn(redux, 'useDispatch'); + const useDispatchMock = jest.fn(() => ({ + unwrap: () => ({ connector }), + })) as jest.Mock; + useDispatchSpy.mockReturnValue(useDispatchMock); + + renderComponent(); await simulateFormSubmit(); - expect(createConnector).toHaveBeenCalledTimes(1); - expect(createConnector).toHaveBeenCalledWith({ - clusterName, - connectName: connects[0].name, - newConnector: { - name: 'my-connector', - config: { class: 'MyClass' }, - }, - }); + expect(useDispatchMock).toHaveBeenCalledTimes(1); }); it('redirects to connector details view on successful submit', async () => { - const createConnector = jest.fn().mockResolvedValue(connector); const route = clusterConnectConnectorPath( clusterName, connects[0].name, connector.name ); - renderComponent({ createConnector }); - mockHistoryPush(route); + + const useDispatchSpy = jest.spyOn(redux, 'useDispatch'); + const useDispatchMock = jest.fn(() => ({ + unwrap: () => ({ connector }), + })) as jest.Mock; + useDispatchSpy.mockReturnValue(useDispatchMock); + + renderComponent(); await simulateFormSubmit(); expect(mockHistoryPush).toHaveBeenCalledTimes(1); @@ -101,8 +103,13 @@ describe('New', () => { }); it('does not redirect to connector details view on unsuccessful submit', async () => { - const createConnector = jest.fn().mockResolvedValueOnce(undefined); - renderComponent({ createConnector }); + const useDispatchSpy = jest.spyOn(redux, 'useDispatch'); + const useDispatchMock = jest.fn(async () => ({ + unwrap: () => ({}), + })) as jest.Mock; + useDispatchSpy.mockReturnValue(useDispatchMock); + + renderComponent(); await simulateFormSubmit(); expect(mockHistoryPush).not.toHaveBeenCalled(); });
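A hedged sketch of the `dispatch(...).unwrap()` flow the fix relies on — `createConnector` here is a toy stand-in for the app's thunk, not its real implementation:

```ts
import { configureStore, createAsyncThunk } from '@reduxjs/toolkit';

// Toy stand-in for the app's createConnector thunk.
const createConnector = createAsyncThunk(
  'connect/createConnector',
  async (payload: { name: string }) => {
    // A real implementation would call the connect API client here.
    return { connector: { name: payload.name } };
  }
);

const store = configureStore({ reducer: (state: object = {}) => state });

async function submit(): Promise<void> {
  try {
    // unwrap() re-throws on rejection, so success and failure split cleanly.
    const { connector } = await store
      .dispatch(createConnector({ name: 'my-connector' }))
      .unwrap();
    console.log('created, safe to navigate to', connector.name);
  } catch (err) {
    console.error('creation failed', err);
  }
}

void submit();
```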
null
train
train
2022-05-31T13:49:47
"2022-05-31T11:20:43Z"
Haarolean
train
provectus/kafka-ui/2061_2075
provectus/kafka-ui
provectus/kafka-ui/2061
provectus/kafka-ui/2075
[ "connected" ]
89c409a5d605dcb8dfc5d1452cd8b9ede0029c69
c42858e722f7dd527ec2b0c3b809f7e820c23397
[ "#1961 " ]
[]
"2022-05-31T12:47:33Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed", "type/regression" ]
Topic/Settings and Topic/Consumers pages are empty when opened via direct URL
**Describe the bug** The page stays empty when redirecting by URL to Topic/Settings or Topic/Consumers **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Open the URL of the Settings/Consumers tab for a Topic (https://www.kafka-ui.provectus.io/ui/clusters/local/topics/LLLLLLLLLL-copy-2/settings) **Expected behavior** The corresponding topic tab should open when the URL is opened **Screenshots** <img width="1718" alt="empty settings" src="https://user-images.githubusercontent.com/104780608/171159742-9a17f9b4-9c38-43a8-afd5-c92c796e3d23.png"> **Additional context** links to https://github.com/provectus/kafka-ui/issues/1961
[ "kafka-ui-react-app/src/redux/reducers/topics/selectors.ts" ]
[ "kafka-ui-react-app/src/redux/reducers/topics/selectors.ts" ]
[]
diff --git a/kafka-ui-react-app/src/redux/reducers/topics/selectors.ts b/kafka-ui-react-app/src/redux/reducers/topics/selectors.ts index 311dcdc932a..bd47f08a41f 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/selectors.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/selectors.ts @@ -138,7 +138,7 @@ const getTopicName = (_: RootState, topicName: TopicName) => topicName; export const getTopicByName = createSelector( getTopicMap, getTopicName, - (topics, topicName) => topics[topicName] + (topics, topicName) => topics[topicName] || {} ); export const getPartitionsByTopicName = createSelector( @@ -201,9 +201,8 @@ export const getIsTopicInternal = createSelector( ); export const getTopicConsumerGroups = createSelector( - getTopicMap, - getTopicName, - (topics, topicName) => topics[topicName].consumerGroups || [] + getTopicByName, + ({ consumerGroups }) => consumerGroups || [] ); export const getMessageSchemaByTopicName = createSelector(
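The essence of the fix is a defensive fallback in the selector chain. A condensed reselect sketch of the same idea, with an illustrative state shape (not the app's real one):

```ts
import { createSelector } from 'reselect';

interface Topic {
  consumerGroups?: string[];
}

interface RootState {
  topics: { byName: Record<string, Topic> };
}

const getTopicMap = (state: RootState) => state.topics.byName;
const getTopicName = (_: RootState, topicName: string) => topicName;

// Falling back to an empty object keeps dependent selectors from
// reading fields of `undefined` while the topic is still loading.
const getTopicByName = createSelector(
  getTopicMap,
  getTopicName,
  (topics, topicName) => topics[topicName] ?? ({} as Topic)
);

const getTopicConsumerGroups = createSelector(
  getTopicByName,
  ({ consumerGroups }) => consumerGroups ?? []
);

const empty: RootState = { topics: { byName: {} } };
console.log(getTopicConsumerGroups(empty, 'missing-topic')); // -> []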
null
val
train
2022-05-31T15:10:16
"2022-05-31T11:13:49Z"
armenuikafka
train
provectus/kafka-ui/2052_2077
provectus/kafka-ui
provectus/kafka-ui/2052
provectus/kafka-ui/2077
[ "timestamp(timedelta=0.0, similarity=0.9210831648617049)", "connected" ]
c42858e722f7dd527ec2b0c3b809f7e820c23397
eb9aeef2d1bbc8f97a71ee80830a1e37f25e67ee
[]
[]
"2022-05-31T13:05:28Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
Disable "produce message" button for r/o clusters
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx index f3d1c33d1a4..69540fdc284 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Details.tsx @@ -113,6 +113,7 @@ const Details: React.FC<Props> = ({ buttonSize="M" buttonType="primary" to={`../${clusterTopicSendMessageRelativePath}`} + disabled={isReadOnly} > Produce Message </Button>
null
train
train
2022-05-31T15:17:40
"2022-05-27T13:42:35Z"
Haarolean
train
provectus/kafka-ui/2083_2089
provectus/kafka-ui
provectus/kafka-ui/2083
provectus/kafka-ui/2089
[ "connected" ]
f0673b60058f61d56f5e4538142ec36ffc37acb2
c0b3ca3bc604256848ba59afc21719632d37c367
[ "need to make UI use new API", "Seems the issue already exists. \r\n\r\n<img width=\"1718\" alt=\"ksql new\" src=\"https://user-images.githubusercontent.com/104780608/172543231-8b47ca56-58f4-4089-97b2-6a6bd2fcb807.png\">\r\n", "@armenuikafka this is not connected with ksql, your cognito session just closed", "@iliax thanks, will take a look" ]
[]
"2022-06-01T07:26:36Z"
[ "type/bug", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
KSQL view data doesn't load
**Describe the bug** Sometimes KSQL DB data does not load and a 422 error appears **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to KSQL DB **Expected behavior** All the data should be displayed within KSQL DB **Screenshots** https://user-images.githubusercontent.com/104780608/171206802-370b2991-75e7-43e8-b137-5bd15c397c9d.mov **Additional context** Does not happen every time
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java" ]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java index 910bb812480..62dc24fab28 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java @@ -6,6 +6,8 @@ import com.provectus.kafka.ui.model.KsqlCommandV2DTO; import com.provectus.kafka.ui.model.KsqlCommandV2ResponseDTO; import com.provectus.kafka.ui.model.KsqlResponseDTO; +import com.provectus.kafka.ui.model.KsqlStreamDescriptionDTO; +import com.provectus.kafka.ui.model.KsqlTableDescriptionDTO; import com.provectus.kafka.ui.model.KsqlTableResponseDTO; import com.provectus.kafka.ui.service.KsqlService; import com.provectus.kafka.ui.service.ksql.KsqlServiceV2; @@ -64,4 +66,16 @@ public Mono<ResponseEntity<Flux<KsqlResponseDTO>>> openKsqlResponsePipe(String c .columnNames(table.getColumnNames()) .values((List<List<Object>>) ((List<?>) (table.getValues()))))))); } + + @Override + public Mono<ResponseEntity<Flux<KsqlStreamDescriptionDTO>>> listStreams(String clusterName, + ServerWebExchange exchange) { + return Mono.just(ResponseEntity.ok(ksqlServiceV2.listStreams(getCluster(clusterName)))); + } + + @Override + public Mono<ResponseEntity<Flux<KsqlTableDescriptionDTO>>> listTables(String clusterName, + ServerWebExchange exchange) { + return Mono.just(ResponseEntity.ok(ksqlServiceV2.listTables(getCluster(clusterName)))); + } } diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java index 339431b295e..9341dd30d3b 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java @@ -42,6 +42,10 @@ public static class KsqlResponseTable { String header; List<String> columnNames; List<List<JsonNode>> values; + + public Optional<JsonNode> getColumnValue(List<JsonNode> row, String column) { + return Optional.ofNullable(row.get(columnNames.indexOf(column))); + } } @Value diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java index 13564f54a08..12a2e7d0b24 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java @@ -1,17 +1,24 @@ package com.provectus.kafka.ui.service.ksql; +import com.fasterxml.jackson.databind.JsonNode; import com.google.common.cache.Cache; import com.google.common.cache.CacheBuilder; +import com.provectus.kafka.ui.exception.KsqlApiException; import com.provectus.kafka.ui.exception.ValidationException; import com.provectus.kafka.ui.model.KafkaCluster; +import com.provectus.kafka.ui.model.KsqlStreamDescriptionDTO; +import com.provectus.kafka.ui.model.KsqlTableDescriptionDTO; import com.provectus.kafka.ui.service.ksql.KsqlApiClient.KsqlResponseTable; import java.util.Map; import java.util.UUID; import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; import lombok.Value; +import lombok.extern.slf4j.Slf4j; import org.springframework.stereotype.Service; import reactor.core.publisher.Flux; +@Slf4j @Service public class KsqlServiceV2 { @@ -45,4 +52,47 @@ public Flux<KsqlResponseTable> execute(String 
commandId) { .execute(cmd.ksql, cmd.streamProperties); } + public Flux<KsqlTableDescriptionDTO> listTables(KafkaCluster cluster) { + return new KsqlApiClient(cluster) + .execute("LIST TABLES;", Map.of()) + .flatMap(resp -> { + if (!resp.getHeader().equals("Tables")) { + log.error("Unexpected result header: {}", resp.getHeader()); + log.debug("Unexpected result {}", resp); + return Flux.error(new KsqlApiException("Error retrieving tables list")); + } + return Flux.fromIterable(resp.getValues() + .stream() + .map(row -> + new KsqlTableDescriptionDTO() + .name(resp.getColumnValue(row, "name").map(JsonNode::asText).orElse(null)) + .topic(resp.getColumnValue(row, "topic").map(JsonNode::asText).orElse(null)) + .keyFormat(resp.getColumnValue(row, "keyFormat").map(JsonNode::asText).orElse(null)) + .valueFormat(resp.getColumnValue(row, "valueFormat").map(JsonNode::asText).orElse(null)) + .isWindowed(resp.getColumnValue(row, "isWindowed").map(JsonNode::asBoolean).orElse(null))) + .collect(Collectors.toList())); + }); + } + + public Flux<KsqlStreamDescriptionDTO> listStreams(KafkaCluster cluster) { + return new KsqlApiClient(cluster) + .execute("LIST STREAMS;", Map.of()) + .flatMap(resp -> { + if (!resp.getHeader().equals("Streams")) { + log.error("Unexpected result header: {}", resp.getHeader()); + log.debug("Unexpected result {}", resp); + return Flux.error(new KsqlApiException("Error retrieving streams list")); + } + return Flux.fromIterable(resp.getValues() + .stream() + .map(row -> + new KsqlStreamDescriptionDTO() + .name(resp.getColumnValue(row, "name").map(JsonNode::asText).orElse(null)) + .topic(resp.getColumnValue(row, "topic").map(JsonNode::asText).orElse(null)) + .keyFormat(resp.getColumnValue(row, "keyFormat").map(JsonNode::asText).orElse(null)) + .valueFormat(resp.getColumnValue(row, "valueFormat").map(JsonNode::asText).orElse(null))) + .collect(Collectors.toList())); + }); + } + } diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index 5cc8d4d9b97..1416609982c 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -1514,6 +1514,50 @@ paths: schema: $ref: '#/components/schemas/KsqlCommandV2Response' + /api/clusters/{clusterName}/ksql/tables: + get: + tags: + - Ksql + summary: listTables + operationId: listTables + parameters: + - name: clusterName + in: path + required: true + schema: + type: string + responses: + 200: + description: OK + content: + application/json: + schema: + type: array + items: + $ref: '#/components/schemas/KsqlTableDescription' + + /api/clusters/{clusterName}/ksql/streams: + get: + tags: + - Ksql + summary: listStreams + operationId: listStreams + parameters: + - name: clusterName + in: path + required: true + schema: + type: string + responses: + 200: + description: OK + content: + application/json: + schema: + type: array + items: + $ref: '#/components/schemas/KsqlStreamDescription' + /api/clusters/{clusterName}/ksql/response: get: tags: @@ -2690,6 +2734,32 @@ components: required: - pipeId + KsqlTableDescription: + type: object + properties: + name: + type: string + topic: + type: string + keyFormat: + type: string + valueFormat: + type: string + isWindowed: + type: boolean + + KsqlStreamDescription: + type: object + properties: + name: + type: string + topic: + type: string + keyFormat: + type: string + valueFormat: + type: string + KsqlCommandResponse: type: object properties:
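The contract change above introduces plain GET endpoints for KSQL tables and streams. A minimal TypeScript client sketch against them, assuming same-origin deployment; the response type mirrors the `KsqlTableDescription` schema added to the spec:

```ts
// Response shape mirroring the KsqlTableDescription schema in the spec.
interface KsqlTableDescription {
  name?: string;
  topic?: string;
  keyFormat?: string;
  valueFormat?: string;
  isWindowed?: boolean;
}

async function listKsqlTables(clusterName: string): Promise<KsqlTableDescription[]> {
  const res = await fetch(
    `/api/clusters/${encodeURIComponent(clusterName)}/ksql/tables`
  );
  if (!res.ok) {
    throw new Error(`listTables failed with HTTP ${res.status}`);
  }
  return res.json();
}

// Usage: list table names for the 'local' cluster.
listKsqlTables('local').then((tables) =>
  tables.forEach((t) => console.log(t.name, '->', t.topic))
);
```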
diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java new file mode 100644 index 00000000000..22c87c9aecc --- /dev/null +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java @@ -0,0 +1,108 @@ +package com.provectus.kafka.ui.service.ksql; + +import static org.assertj.core.api.Assertions.assertThat; + +import com.provectus.kafka.ui.AbstractIntegrationTest; +import com.provectus.kafka.ui.container.KsqlDbContainer; +import com.provectus.kafka.ui.model.KafkaCluster; +import com.provectus.kafka.ui.model.KsqlStreamDescriptionDTO; +import com.provectus.kafka.ui.model.KsqlTableDescriptionDTO; +import java.util.Map; +import java.util.Set; +import java.util.concurrent.CopyOnWriteArraySet; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; +import org.testcontainers.utility.DockerImageName; + +class KsqlServiceV2Test extends AbstractIntegrationTest { + + private static final KsqlDbContainer KSQL_DB = new KsqlDbContainer( + DockerImageName.parse("confluentinc/ksqldb-server").withTag("0.24.0")) + .withKafka(kafka); + + private static final Set<String> STREAMS_TO_DELETE = new CopyOnWriteArraySet<>(); + private static final Set<String> TABLES_TO_DELETE = new CopyOnWriteArraySet<>(); + + @BeforeAll + static void init() { + KSQL_DB.start(); + } + + @AfterAll + static void cleanup() { + var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build()); + + TABLES_TO_DELETE.forEach(t -> + client.execute(String.format("DROP TABLE IF EXISTS %s DELETE TOPIC;", t), Map.of()) + .blockLast()); + + STREAMS_TO_DELETE.forEach(s -> + client.execute(String.format("DROP STREAM IF EXISTS %s DELETE TOPIC;", s), Map.of()) + .blockLast()); + + KSQL_DB.stop(); + } + + private final KsqlServiceV2 ksqlService = new KsqlServiceV2(); + + @Test + void listStreamsReturnsAllKsqlStreams() { + var cluster = KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(); + var streamName = "stream_" + System.currentTimeMillis(); + STREAMS_TO_DELETE.add(streamName); + + new KsqlApiClient(cluster) + .execute( + String.format("CREATE STREAM %s ( " + + " c1 BIGINT KEY, " + + " c2 VARCHAR " + + " ) WITH ( " + + " KAFKA_TOPIC = '%s_topic', " + + " PARTITIONS = 1, " + + " VALUE_FORMAT = 'JSON' " + + " );", streamName, streamName), + Map.of()) + .blockLast(); + + var streams = ksqlService.listStreams(cluster).collectList().block(); + assertThat(streams).contains( + new KsqlStreamDescriptionDTO() + .name(streamName.toUpperCase()) + .topic(streamName + "_topic") + .keyFormat("KAFKA") + .valueFormat("JSON") + ); + } + + @Test + void listTablesReturnsAllKsqlTables() { + var cluster = KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(); + var tableName = "table_" + System.currentTimeMillis(); + TABLES_TO_DELETE.add(tableName); + + new KsqlApiClient(cluster) + .execute( + String.format("CREATE TABLE %s ( " + + " c1 BIGINT PRIMARY KEY, " + + " c2 VARCHAR " + + " ) WITH ( " + + " KAFKA_TOPIC = '%s_topic', " + + " PARTITIONS = 1, " + + " VALUE_FORMAT = 'JSON' " + + " );", tableName, tableName), + Map.of()) + .blockLast(); + + var tables = ksqlService.listTables(cluster).collectList().block(); + assertThat(tables).contains( + new KsqlTableDescriptionDTO() + .name(tableName.toUpperCase()) + .topic(tableName + "_topic") + .keyFormat("KAFKA") + .valueFormat("JSON") + .isWindowed(false) + ); + } + 
+} \ No newline at end of file
train
train
2022-06-01T10:33:16
"2022-05-31T15:12:39Z"
armenuikafka
train
provectus/kafka-ui/2091_2092
provectus/kafka-ui
provectus/kafka-ui/2091
provectus/kafka-ui/2092
[ "timestamp(timedelta=0.0, similarity=0.9363814644372193)", "connected" ]
eb9aeef2d1bbc8f97a71ee80830a1e37f25e67ee
f0673b60058f61d56f5e4538142ec36ffc37acb2
[]
[]
"2022-06-01T08:19:31Z"
[ "scope/frontend", "status/accepted", "type/chore" ]
Remove History from dependencies
### Describe the solution you'd like The history library is a direct dependency of v6 (not a peer dep), so you won't ever import or use it directly. Instead, you'll use the `useNavigate()` hook for all navigation . https://reactrouter.com/docs/en/v6/upgrading/v5
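For illustration, a minimal sketch of the `useNavigate()` pattern the statement points to (react-router v6):

```tsx
import React from 'react';
import { useNavigate } from 'react-router-dom';

const BackButton: React.FC = () => {
  // useNavigate() replaces direct history.push()/history.goBack() calls.
  const navigate = useNavigate();

  return (
    <button type="button" onClick={() => navigate(-1)}>
      Go back
    </button>
  );
};

export default BackButton;
```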
[ "kafka-ui-react-app/package-lock.json", "kafka-ui-react-app/package.json" ]
[ "kafka-ui-react-app/package-lock.json", "kafka-ui-react-app/package.json" ]
[]
diff --git a/kafka-ui-react-app/package-lock.json b/kafka-ui-react-app/package-lock.json index 60907257d40..20f1c48d384 100644 --- a/kafka-ui-react-app/package-lock.json +++ b/kafka-ui-react-app/package-lock.json @@ -75,7 +75,6 @@ "eslint-plugin-react": "^7.29.4", "eslint-plugin-react-hooks": "^4.5.0", "fetch-mock-jest": "^1.5.1", - "history": "^5.0.0", "http-proxy-middleware": "^2.0.6", "husky": "^7.0.1", "jest-sonar-reporter": "^2.0.0", diff --git a/kafka-ui-react-app/package.json b/kafka-ui-react-app/package.json index 5b28ae615ef..f30fba154b7 100644 --- a/kafka-ui-react-app/package.json +++ b/kafka-ui-react-app/package.json @@ -106,7 +106,6 @@ "eslint-plugin-react": "^7.29.4", "eslint-plugin-react-hooks": "^4.5.0", "fetch-mock-jest": "^1.5.1", - "history": "^5.0.0", "http-proxy-middleware": "^2.0.6", "husky": "^7.0.1", "jest-sonar-reporter": "^2.0.0",
null
train
train
2022-05-31T17:04:33
"2022-06-01T08:12:07Z"
Mgrdich
train
provectus/kafka-ui/2086_2094
provectus/kafka-ui
provectus/kafka-ui/2086
provectus/kafka-ui/2094
[ "timestamp(timedelta=0.0, similarity=0.8694610240446166)", "connected" ]
a211c412074e7e0efa03ad7b2fe9eaabb311060c
9eae6b1f0ab89fc4a8a870f7173001a62030018f
[]
[]
"2022-06-01T12:10:52Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Success message not displayed when removing a Topic
**Describe the bug** A success message is not displayed when removing a Topic, from either the Topics page or the Topic's profile **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Press the menu icon 3. Select Remove Topic **Expected behavior** A success message about the Topic's removal should be displayed
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts index c81eb696e18..c4a0657970a 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts @@ -368,7 +368,16 @@ describe('topics Slice', () => { describe('Thunks', () => { const store = mockStoreCreator; const topicName = topic.name; + const RealDate = Date.now; + beforeAll(() => { + global.Date.now = jest.fn(() => + new Date('2019-04-07T10:20:30Z').getTime() + ); + }); + afterAll(() => { + global.Date.now = RealDate; + }); afterEach(() => { fetchMock.restore(); store.clearActions(); @@ -495,6 +504,18 @@ describe('topics Slice', () => { expect(getTypeAndPayload(store)).toEqual([ { type: deleteTopic.pending.type }, + { type: showSuccessAlert.pending.type }, + { + type: alertAdded.type, + payload: { + id: 'message-topic-local', + title: '', + type: 'success', + createdAt: global.Date.now(), + message: 'Topic successfully deleted!', + }, + }, + { type: showSuccessAlert.fulfilled.type }, { type: deleteTopic.fulfilled.type, payload: { topicName }, @@ -662,17 +683,6 @@ describe('topics Slice', () => { }); }); describe('updateTopicPartitionsCount', () => { - const RealDate = Date.now; - - beforeAll(() => { - global.Date.now = jest.fn(() => - new Date('2019-04-07T10:20:30Z').getTime() - ); - }); - - afterAll(() => { - global.Date.now = RealDate; - }); it('updateTopicPartitionsCount/fulfilled', async () => { fetchMock.patchOnce( `/api/clusters/${clusterName}/topics/${topicName}/partitions`, diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index 3a35cc3d0ee..dfeb437b242 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -201,11 +201,16 @@ export const updateTopic = createAsyncThunk< export const deleteTopic = createAsyncThunk< { topicName: TopicName }, DeleteTopicRequest ->('topic/deleteTopic', async (payload, { rejectWithValue }) => { +>('topic/deleteTopic', async (payload, { rejectWithValue, dispatch }) => { try { - const { topicName } = payload; + const { topicName, clusterName } = payload; await topicsApiClient.deleteTopic(payload); - + dispatch( + showSuccessAlert({ + id: `message-${topicName}-${clusterName}`, + message: 'Topic successfully deleted!', + }) + ); return { topicName }; } catch (err) { return rejectWithValue(await getResponse(err as Response));
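A trimmed sketch of the shape of this fix — dispatching a success alert from inside the thunk via `thunkAPI` — with `showSuccessAlert` as a stand-in, not the app's real alert thunk:

```ts
import { createAsyncThunk } from '@reduxjs/toolkit';

// Stand-in for the app's alert thunk.
const showSuccessAlert = createAsyncThunk(
  'alerts/showSuccessAlert',
  async (payload: { id: string; message: string }) => payload
);

const deleteTopic = createAsyncThunk<
  { topicName: string },
  { clusterName: string; topicName: string }
>('topic/deleteTopic', async ({ clusterName, topicName }, { dispatch }) => {
  // A real implementation would call the topics API client first.
  dispatch(
    showSuccessAlert({
      id: `message-${topicName}-${clusterName}`,
      message: 'Topic successfully deleted!',
    })
  );
  return { topicName };
});

export default deleteTopic;
```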
null
train
train
2022-06-02T11:33:13
"2022-05-31T17:58:09Z"
armenuikafka
train
provectus/kafka-ui/2087_2095
provectus/kafka-ui
provectus/kafka-ui/2087
provectus/kafka-ui/2095
[ "timestamp(timedelta=0.0, similarity=0.8679418280686718)", "connected" ]
c1bdbec2b2d9158c9c210f77748d330812e28e2a
0f0d51d386034d2822f8b8a91496962a31ed647a
[]
[]
"2022-06-01T12:41:45Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Loading icon persists when removing more than one Topic
**Describe the bug** When removing more than one Topic, the loading icon stays spinning **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Select more than one Topic 3. Press 'Delete selected topics' **Expected behavior** The loading icon should disappear and the topics should be removed without refreshing **Screenshots** https://user-images.githubusercontent.com/104780608/171255275-b75d6c40-2c5e-4ed7-8dd0-55935cdeef84.mov
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts index c4a0657970a..70d85d46dc4 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts @@ -557,6 +557,7 @@ describe('topics Slice', () => { { type: deleteTopics.pending.type }, { type: deleteTopic.pending.type }, { type: deleteTopic.pending.type }, + { type: fetchTopicsList.pending.type }, { type: deleteTopics.fulfilled.type }, ]); }); diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index dfeb437b242..525f7548419 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -313,6 +313,7 @@ export const deleteTopics = createAsyncThunk< topicNames.forEach((topicName) => { dispatch(deleteTopic({ clusterName, topicName })); }); + dispatch(fetchTopicsList({ clusterName })); return undefined; } catch (err) {
null
train
train
2022-06-03T14:36:06
"2022-05-31T18:18:08Z"
armenuikafka
train
provectus/kafka-ui/2083_2098
provectus/kafka-ui
provectus/kafka-ui/2083
provectus/kafka-ui/2098
[ "connected" ]
657025dfd46cad53a22592e6e0d083191c18b3ad
070fba4d08c7667b3b21bfb77a4ab3066c91fafb
[ "need to make UI use new API", "Seems the issue already exists. \r\n\r\n<img width=\"1718\" alt=\"ksql new\" src=\"https://user-images.githubusercontent.com/104780608/172543231-8b47ca56-58f4-4089-97b2-6a6bd2fcb807.png\">\r\n", "@armenuikafka this is not connected with ksql, your cognito session just closed", "@iliax thanks, will take a look" ]
[ "may be we should just define interface of `headers` array", "So, we use this often, lets create own Type \r\n\r\n```ts\r\ntype KsqlDescriptionAccessor = keyof KsqlDescription;\r\n```" ]
"2022-06-02T08:18:19Z"
[ "type/bug", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
KSQL view data doesn't load
**Describe the bug** Sometimes KSQL DB data does not load and a 422 error appears **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to KSQL DB **Expected behavior** All the data should be displayed within KSQL DB **Screenshots** https://user-images.githubusercontent.com/104780608/171206802-370b2991-75e7-43e8-b137-5bd15c397c9d.mov **Additional context** Does not happen every time
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx", "kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts", "kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts" ]
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx", "kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts", "kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx index 1c0d5ffcc1c..8a0e2b42a9c 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx @@ -11,8 +11,15 @@ import PageHeading from 'components/common/PageHeading/PageHeading'; import { Table } from 'components/common/table/Table/Table.styled'; import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell'; import { Button } from 'components/common/Button/Button'; +import { KsqlDescription } from 'redux/interfaces/ksqlDb'; -const headers = [ +export type KsqlDescriptionAccessor = keyof KsqlDescription; + +interface HeadersType { + Header: string; + accessor: KsqlDescriptionAccessor; +} +const headers: HeadersType[] = [ { Header: 'Type', accessor: 'type' }, { Header: 'Name', accessor: 'name' }, { Header: 'Topic', accessor: 'topic' }, diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx index 65390caa2cc..59ec84095e7 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx @@ -1,14 +1,16 @@ import React from 'react'; +import { KsqlDescription } from 'redux/interfaces/ksqlDb'; +import { KsqlDescriptionAccessor } from 'components/KsqlDb/List/List'; interface Props { - accessors: string[]; - data: Record<string, string>; + accessors: KsqlDescriptionAccessor[]; + data: KsqlDescription; } const ListItem: React.FC<Props> = ({ accessors, data }) => { return ( <tr> - {accessors.map((accessor: string) => ( + {accessors.map((accessor) => ( <td key={accessor}>{data[accessor]}</td> ))} </tr> diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx index 0942ee89185..9014d06d4f2 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/__test__/ListItem.spec.tsx @@ -3,6 +3,7 @@ import { clusterKsqlDbPath } from 'lib/paths'; import { render, WithRoute } from 'lib/testHelpers'; import { screen } from '@testing-library/dom'; import ListItem from 'components/KsqlDb/List/ListItem'; +import { KsqlDescription } from 'redux/interfaces/ksqlDb'; const clusterName = 'local'; @@ -10,7 +11,7 @@ const renderComponent = ({ accessors, data, }: { - accessors: string[]; + accessors: (keyof KsqlDescription)[]; data: Record<string, string>; }) => { render( @@ -24,7 +25,7 @@ const renderComponent = ({ describe('KsqlDb List Item', () => { it('renders placeholder on one data', async () => { renderComponent({ - accessors: ['accessors'], + accessors: ['accessors' as keyof KsqlDescription], data: { accessors: 'accessors text' }, }); diff --git a/kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts b/kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts index 2290fb3ac88..93c53733681 100644 --- a/kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts +++ b/kafka-ui-react-app/src/redux/interfaces/ksqlDb.ts @@ -1,4 +1,8 @@ -import { KsqlCommandV2Response } from 'generated-sources'; +import { + KsqlCommandV2Response, + KsqlStreamDescription, + KsqlTableDescription, +} from 'generated-sources'; export interface KsqlTables { data: { @@ -8,7 +12,16 @@ export interface KsqlTables { } export interface KsqlState { - tables: Dictionary<string>[]; - streams: Dictionary<string>[]; + tables: 
KsqlTableDescription[]; + streams: KsqlStreamDescription[]; executionResult: KsqlCommandV2Response | null; } + +export interface KsqlDescription { + type?: string; + name?: string; + topic?: string; + keyFormat?: string; + valueFormat?: string; + isWindowed?: boolean; +} diff --git a/kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts b/kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts index 33f65a81759..84dd356606f 100644 --- a/kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/ksqlDb/ksqlDbSlice.ts @@ -26,15 +26,13 @@ export const transformKsqlResponse = ( ); const getTables = (clusterName: ClusterName) => - ksqlDbApiClient.executeKsqlCommand({ + ksqlDbApiClient.listTables({ clusterName, - ksqlCommand: { ksql: 'SHOW TABLES;' }, }); const getStreams = (clusterName: ClusterName) => - ksqlDbApiClient.executeKsqlCommand({ + ksqlDbApiClient.listStreams({ clusterName, - ksqlCommand: { ksql: 'SHOW STREAMS;' }, }); export const fetchKsqlDbTables = createAsyncThunk( @@ -45,9 +43,18 @@ export const fetchKsqlDbTables = createAsyncThunk( getStreams(clusterName), ]); + const processedTables = tables.map((table) => ({ + type: 'TABLE', + ...table, + })); + const processedStreams = streams.map((stream) => ({ + type: 'STREAM', + ...stream, + })); + return { - tables: tables.data ? transformKsqlResponse(tables.data) : [], - streams: streams.data ? transformKsqlResponse(streams.data) : [], + tables: processedTables, + streams: processedStreams, }; } );
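The review comments above settle on typing the column accessors as `keyof` the row interface. A small self-contained sketch of why that helps — an invalid accessor now fails at compile time instead of rendering an empty cell:

```ts
interface KsqlDescription {
  type?: string;
  name?: string;
  topic?: string;
  keyFormat?: string;
  valueFormat?: string;
}

type KsqlDescriptionAccessor = keyof KsqlDescription;

interface HeaderDef {
  Header: string;
  accessor: KsqlDescriptionAccessor;
}

const headers: HeaderDef[] = [
  { Header: 'Type', accessor: 'type' },
  { Header: 'Name', accessor: 'name' },
  // { Header: 'Oops', accessor: 'nope' } // compile error: not a key
];

function renderRow(row: KsqlDescription, accessors: KsqlDescriptionAccessor[]): string[] {
  return accessors.map((a) => String(row[a] ?? ''));
}

console.log(
  renderRow({ type: 'TABLE', name: 'users' }, headers.map((h) => h.accessor))
);
```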
null
test
train
2022-06-06T10:14:48
"2022-05-31T15:12:39Z"
armenuikafka
train
provectus/kafka-ui/2088_2099
provectus/kafka-ui
provectus/kafka-ui/2088
provectus/kafka-ui/2099
[ "connected" ]
0f0d51d386034d2822f8b8a91496962a31ed647a
c64519c2c1b724306b8a76c2908c586e6e20acd4
[ "Seems works as the same \r\n\r\n\r\nhttps://user-images.githubusercontent.com/104780608/172350115-6edab037-7477-4096-ac07-8d96f73f44c7.mov\r\n\r\n", "@armenuikafka unfortunately Kafka-connect API is asynchronous in case of configuration changes applying. After you edit number of tasks - this change will be applied by kafka-connect asynchronously (and it is not specified when actually). So, I think the best thing we can do is to show UI disclaimer popup like \"Changes will be applied asynchronously, refresh connector's page to see updates\".\r\n\r\ncc @Haarolean " ]
[]
"2022-06-02T11:04:19Z"
[ "type/bug", "good first issue", "status/invalid", "scope/backend", "scope/frontend" ]
Connector's tasks count doesn't update without refreshing
**Describe the bug** The connector's tasks count updates only after a refresh **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Kafka connect 2. Open the Connector 3. Edit config 4. Change the tasks count 5. Turn to the Tasks tab **Expected behavior** The updated tasks count should be visible without refreshing the Connector **Screenshots** https://user-images.githubusercontent.com/104780608/171347945-66ce9bb7-c97c-4b1d-8e50-7bdc5f425d75.mov
[ "kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts" ]
[ "kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts index 3630753b716..61e422d8670 100644 --- a/kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/connect/__test__/reducer.spec.ts @@ -726,6 +726,7 @@ describe('Connect slice', () => { ); expect(getTypeAndPayload(store)).toEqual([ { type: updateConnectorConfig.pending.type, payload: undefined }, + { type: fetchConnector.pending.type }, ...getAlertActions(store), { type: updateConnectorConfig.fulfilled.type, diff --git a/kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts b/kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts index 9402b3c2e8c..4b0299a6191 100644 --- a/kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/connect/connectSlice.ts @@ -322,7 +322,7 @@ export const updateConnectorConfig = createAsyncThunk< connectorName, requestBody: connectorConfig, }); - + dispatch(fetchConnector({ clusterName, connectName, connectorName })); dispatch( showSuccessAlert({ id: `connector-${connectorName}-${clusterName}`,
null
train
train
2022-06-06T09:34:32
"2022-06-01T07:21:05Z"
armenuikafka
train
provectus/kafka-ui/2084_2100
provectus/kafka-ui
provectus/kafka-ui/2084
provectus/kafka-ui/2100
[ "timestamp(timedelta=51402.0, similarity=0.8560578912413365)", "connected" ]
c64519c2c1b724306b8a76c2908c586e6e20acd4
7db40577b5745e241071a47696838553fd875ccb
[ "#1961 ", "#2128" ]
[]
"2022-06-02T11:31:28Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed", "type/regression" ]
Compare Versions for Schema is not functional
**Describe the bug** Compare versions for Schema is not working **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Schema Registry 2. Open the Schema 3. Press 'Compare Versions' **Expected behavior** - The right path should be displayed - Comparing should be available **Screenshots** https://user-images.githubusercontent.com/104780608/171213052-3ebc5a23-bf15-47bd-b195-893f292c3cc1.mov
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Schemas.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Schemas.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx index ae11da54b0c..e54683decbd 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx @@ -3,7 +3,7 @@ import { useNavigate } from 'react-router-dom'; import { ClusterSubjectParam, clusterSchemaEditPageRelativePath, - clusterSchemaSchemaDiffRelativePath, + clusterSchemaSchemaDiffPageRelativePath, } from 'lib/paths'; import ClusterContext from 'components/contexts/ClusterContext'; import ConfirmationModal from 'components/common/ConfirmationModal/ConfirmationModal'; @@ -90,7 +90,7 @@ const Details: React.FC = () => { buttonSize="M" buttonType="primary" to={{ - pathname: clusterSchemaSchemaDiffRelativePath, + pathname: clusterSchemaSchemaDiffPageRelativePath, search: `leftVersion=${versions[0]?.version}&rightVersion=${versions[0]?.version}`, }} > diff --git a/kafka-ui-react-app/src/components/Schemas/Schemas.tsx b/kafka-ui-react-app/src/components/Schemas/Schemas.tsx index e98cd333754..45facd46a9e 100644 --- a/kafka-ui-react-app/src/components/Schemas/Schemas.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Schemas.tsx @@ -3,6 +3,7 @@ import { Route, Routes } from 'react-router-dom'; import { clusterSchemaEditRelativePath, clusterSchemaNewRelativePath, + clusterSchemaSchemaDiffRelativePath, RouteParams, } from 'lib/paths'; import List from 'components/Schemas/List/List'; @@ -48,7 +49,7 @@ const Schemas: React.FC = () => { } /> <Route - path={clusterSchemaEditRelativePath} + path={clusterSchemaSchemaDiffRelativePath} element={ <BreadcrumbRoute> <DiffContainer />
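The root cause is a `Route` registered under a different relative-path constant than the one the `Link` targets. A minimal react-router v6 sketch of keeping both on one shared constant (the path name is illustrative, and the component is assumed to be mounted under a parent route ending in `/*` so the nested `Routes` can match):

```tsx
import React from 'react';
import { Link, Route, Routes } from 'react-router-dom';

// One shared constant used by both the Route and the Link avoids the
// mismatch this PR fixes.
const schemaDiffRelativePath = 'diff';

const SchemaPage: React.FC = () => (
  <>
    <Link to={schemaDiffRelativePath}>Compare Versions</Link>
    <Routes>
      <Route path={schemaDiffRelativePath} element={<p>diff view</p>} />
    </Routes>
  </>
);

export default SchemaPage;
```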
null
test
train
2022-06-06T09:41:57
"2022-05-31T15:39:51Z"
armenuikafka
train
provectus/kafka-ui/2085_2104
provectus/kafka-ui
provectus/kafka-ui/2085
provectus/kafka-ui/2104
[ "connected" ]
7db40577b5745e241071a47696838553fd875ccb
657025dfd46cad53a22592e6e0d083191c18b3ad
[]
[]
"2022-06-03T08:16:46Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Recreate Topic is not functional
**Describe the bug** Not possible to recreate a Topic from either the Topics page or the Topic's profile **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Press the menu icon 3. Select Recreate Topic **Expected behavior** - A success message about the recreation should appear - The Topic should be recreated **Screenshots** https://user-images.githubusercontent.com/104780608/171245147-a8e2ff0e-9ad4-4474-abaf-6472515a5041.mov linked to https://github.com/provectus/kafka-ui/issues/1879
[ "kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx", "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx", "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx b/kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx index 22c783ac55a..f19a9b45ad2 100644 --- a/kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx +++ b/kafka-ui-react-app/src/components/common/Icons/DropdownArrowIcon.tsx @@ -13,6 +13,7 @@ const DropdownArrowIcon: React.FC<Props> = ({ isOpen }) => { width="24" height="24" fill="none" + style={{ position: 'absolute', right: '5px' }} stroke="currentColor" strokeWidth="2" color={theme.icons.dropdownArrowIcon} diff --git a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts index 70d85d46dc4..9a3bc5c8db2 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts @@ -585,6 +585,18 @@ describe('topics Slice', () => { expect(getTypeAndPayload(store)).toEqual([ { type: recreateTopic.pending.type }, + { type: showSuccessAlert.pending.type }, + { + type: alertAdded.type, + payload: { + id: 'message-topic-local', + title: '', + type: 'success', + createdAt: global.Date.now(), + message: 'Topic successfully recreated!', + }, + }, + { type: showSuccessAlert.fulfilled.type }, { type: recreateTopic.fulfilled.type, payload: { [topicName]: { ...recreateResponse } }, diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index 525f7548419..55c2f267199 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -220,9 +220,17 @@ export const deleteTopic = createAsyncThunk< export const recreateTopic = createAsyncThunk< { topic: Topic }, RecreateTopicRequest ->('topic/recreateTopic', async (payload, { rejectWithValue }) => { +>('topic/recreateTopic', async (payload, { rejectWithValue, dispatch }) => { try { + const { topicName, clusterName } = payload; const topic = await topicsApiClient.recreateTopic(payload); + dispatch( + showSuccessAlert({ + id: `message-${topicName}-${clusterName}`, + message: 'Topic successfully recreated!', + }) + ); + return { topic }; } catch (err) { return rejectWithValue(await getResponse(err as Response));
null
train
train
2022-06-06T09:49:30
"2022-05-31T17:46:31Z"
armenuikafka
train
provectus/kafka-ui/1175_2106
provectus/kafka-ui
provectus/kafka-ui/1175
provectus/kafka-ui/2106
[ "connected" ]
4b70cbbde4b0f23dc6d010841fe5020536a23e4e
c1bdbec2b2d9158c9c210f77748d330812e28e2a
[]
[]
"2022-06-03T09:01:02Z"
[ "type/enhancement", "good first issue", "scope/backend", "scope/frontend", "status/accepted" ]
Implement logout button
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java index 4ee3e53b5b5..6bd56a877fe 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/BasicAuthSecurityConfig.java @@ -1,14 +1,18 @@ package com.provectus.kafka.ui.config.auth; import com.provectus.kafka.ui.util.EmptyRedirectStrategy; +import java.net.URI; import lombok.extern.log4j.Log4j2; import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.security.config.annotation.web.reactive.EnableWebFluxSecurity; +import org.springframework.security.config.web.server.SecurityWebFiltersOrder; import org.springframework.security.config.web.server.ServerHttpSecurity; import org.springframework.security.web.server.SecurityWebFilterChain; import org.springframework.security.web.server.authentication.RedirectServerAuthenticationSuccessHandler; +import org.springframework.security.web.server.authentication.logout.RedirectServerLogoutSuccessHandler; +import org.springframework.security.web.server.ui.LogoutPageGeneratingWebFilter; @Configuration @EnableWebFluxSecurity @@ -16,25 +20,28 @@ @Log4j2 public class BasicAuthSecurityConfig extends AbstractAuthSecurityConfig { + public static final String LOGIN_URL = "/auth"; + public static final String LOGOUT_URL = "/auth?logout"; + @Bean public SecurityWebFilterChain configure(ServerHttpSecurity http) { log.info("Configuring LOGIN_FORM authentication."); - http.authorizeExchange() - .pathMatchers(AUTH_WHITELIST) - .permitAll() - .anyExchange() - .authenticated(); - - final RedirectServerAuthenticationSuccessHandler handler = new RedirectServerAuthenticationSuccessHandler(); - handler.setRedirectStrategy(new EmptyRedirectStrategy()); - - http - .httpBasic().and() - .formLogin() - .loginPage("/auth") - .authenticationSuccessHandler(handler); - - return http.csrf().disable().build(); + + final var authHandler = new RedirectServerAuthenticationSuccessHandler(); + authHandler.setRedirectStrategy(new EmptyRedirectStrategy()); + + final var logoutSuccessHandler = new RedirectServerLogoutSuccessHandler(); + logoutSuccessHandler.setLogoutSuccessUrl(URI.create(LOGOUT_URL)); + + return http + .addFilterAfter(new LogoutPageGeneratingWebFilter(), SecurityWebFiltersOrder.REACTOR_CONTEXT) + .csrf().disable() + .authorizeExchange() + .pathMatchers(AUTH_WHITELIST).permitAll() + .anyExchange().authenticated() + .and().formLogin().loginPage(LOGIN_URL).authenticationSuccessHandler(authHandler) + .and().logout().logoutSuccessHandler(logoutSuccessHandler) + .and().build(); } }
null
val
train
2022-06-03T12:24:13
"2021-12-06T18:58:53Z"
Haarolean
train
provectus/kafka-ui/2113_2115
provectus/kafka-ui
provectus/kafka-ui/2113
provectus/kafka-ui/2115
[ "timestamp(timedelta=0.0, similarity=0.8642288019578304)", "connected" ]
f98c26e4fae6b64cf97ed6061d685f08c021b17e
5fa2bcf5b2d9cd3a90b008d925fd26af82f90647
[]
[]
"2022-06-03T16:12:18Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
KSQL table list columns list is broken
<img width="1303" alt="image" src="https://user-images.githubusercontent.com/1494347/171866853-95863ace-cef9-4b41-b903-0bb16adae8f5.png">
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx" ]
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx index 8a0e2b42a9c..0e5ccc9facb 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx @@ -73,7 +73,6 @@ const List: FC = () => { <Table isFullwidth> <thead> <tr> - <TableHeaderCell title={' '} key="empty cell" /> {headers.map(({ Header, accessor }) => ( <TableHeaderCell title={Header} key={accessor} /> ))}
null
val
train
2022-06-06T23:46:01
"2022-06-03T13:47:29Z"
Haarolean
train
provectus/kafka-ui/2096_2126
provectus/kafka-ui
provectus/kafka-ui/2096
provectus/kafka-ui/2126
[ "connected" ]
070fba4d08c7667b3b21bfb77a4ab3066c91fafb
1ca8873d35af5e3f130026d056aa26d88c0cc643
[ "#1392 " ]
[]
"2022-06-06T09:42:50Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Issue related to the responsive view of Topic/Schema/Connector profiles
**Describe the bug** In the responsive view, information is not displayed fully within Topic/Schema/Connector profiles **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to a Topic/Schema/Connector profile 2. Make the browser window smaller **Expected behavior** When the window becomes smaller, all the information/buttons should be displayed fully, as in the default view **Screenshots** https://user-images.githubusercontent.com/104780608/171427320-b4b1ed17-4a2f-465e-b7cb-3b37a0caadc4.mov
[ "kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx", "kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx" ]
[ "kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx", "kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx b/kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx index db26a27deee..4c855859594 100644 --- a/kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx +++ b/kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx @@ -14,6 +14,8 @@ import { Button } from 'components/common/Button/Button'; const ConnectorActionsWrapperStyled = styled.div` display: flex; + flex-wrap: wrap; + align-items: center; gap: 8px; `; diff --git a/kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx b/kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx index 426532ef4cc..da9903067fb 100644 --- a/kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx +++ b/kafka-ui-react-app/src/components/common/PageHeading/PageHeading.tsx @@ -21,11 +21,10 @@ const PageHeading: React.FC<PropsWithChildren<Props>> = ({ }; export default styled(PageHeading)` - height: 56px; display: flex; justify-content: space-between; align-items: center; - padding: 0px 16px; + padding: 16px; & > div { display: flex;
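The patch leans on `flex-wrap` so header actions reflow onto a new row instead of clipping on narrow viewports. A condensed styled-components sketch of the same rule:

```ts
import styled from 'styled-components';

// flex-wrap lets the action buttons drop onto a new row instead of
// overflowing when the viewport narrows.
export const ActionsWrapper = styled.div`
  display: flex;
  flex-wrap: wrap;
  align-items: center;
  gap: 8px;
`;
```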
null
train
train
2022-06-06T10:32:47
"2022-06-01T14:25:43Z"
armenuikafka
train
provectus/kafka-ui/2127_2131
provectus/kafka-ui
provectus/kafka-ui/2127
provectus/kafka-ui/2131
[ "connected" ]
2fcb0d1abef83189fbd1cf26167baad3074f1ac4
16ac428610e448486e1f3402cd45def2aa462499
[]
[ "Why not use closeSidebar function? If you want to prevent running useEffect on each render, instead of avoiding using closeSidebar you can wrap closeSidebar function In useCallback. Please, check it.", "fixed", "Should this file be deleted?", "restored" ]
"2022-06-06T11:02:37Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Sidebar sandwich menu click doesn't work
![image](https://user-images.githubusercontent.com/1494347/172137726-51708bf0-cba5-42c7-91bb-fe0f5130fdbb.png)
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx" ]
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/App.styled.ts b/kafka-ui-react-app/src/components/App.styled.ts index 64f3296861d..94400cea879 100644 --- a/kafka-ui-react-app/src/components/App.styled.ts +++ b/kafka-ui-react-app/src/components/App.styled.ts @@ -88,7 +88,7 @@ export const Overlay = styled.div<{ $visible: boolean }>( bottom: 0; right: 0; visibility: 'visible'; - opacity: 1; + opacity: 0.7; background-color: ${theme.layout.overlay.backgroundColor}; } `} diff --git a/kafka-ui-react-app/src/components/App.tsx b/kafka-ui-react-app/src/components/App.tsx index c0740f79417..91b74145e44 100644 --- a/kafka-ui-react-app/src/components/App.tsx +++ b/kafka-ui-react-app/src/components/App.tsx @@ -1,4 +1,4 @@ -import React from 'react'; +import React, { useCallback } from 'react'; import { Routes, Route, useLocation } from 'react-router-dom'; import { GIT_TAG, GIT_COMMIT } from 'lib/constants'; import { clusterPath, getNonExactPath } from 'lib/paths'; @@ -25,15 +25,13 @@ const App: React.FC = () => { const areClustersFulfilled = useAppSelector(getAreClustersFulfilled); const clusters = useAppSelector(getClusterList); const [isSidebarVisible, setIsSidebarVisible] = React.useState(false); - const onBurgerClick = () => setIsSidebarVisible(!isSidebarVisible); - const closeSidebar = () => setIsSidebarVisible(false); - + const closeSidebar = useCallback(() => setIsSidebarVisible(false), []); const location = useLocation(); React.useEffect(() => { closeSidebar(); - }, [closeSidebar, location]); + }, [location, closeSidebar]); React.useEffect(() => { dispatch(fetchClusters());
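A minimal sketch of the memoization pattern applied above: `useCallback` keeps `closeSidebar`'s identity stable, so the location-watching effect fires only on navigation:

```tsx
import React, { useCallback, useEffect, useState } from 'react';
import { useLocation } from 'react-router-dom';

const Layout: React.FC = () => {
  const [isSidebarVisible, setIsSidebarVisible] = useState(false);

  // useCallback keeps the function identity stable across renders, so the
  // effect below re-runs only when the location actually changes.
  const closeSidebar = useCallback(() => setIsSidebarVisible(false), []);
  const location = useLocation();

  useEffect(() => {
    closeSidebar();
  }, [location, closeSidebar]);

  return (
    <button type="button" onClick={() => setIsSidebarVisible((v) => !v)}>
      {isSidebarVisible ? 'Close menu' : 'Open menu'}
    </button>
  );
};

export default Layout;
```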
null
train
train
2022-06-08T14:00:19
"2022-06-06T09:47:48Z"
Haarolean
train
provectus/kafka-ui/2111_2137
provectus/kafka-ui
provectus/kafka-ui/2111
provectus/kafka-ui/2137
[ "timestamp(timedelta=0.0, similarity=0.9296936283261605)", "connected" ]
d0e1e2bf6a5e7172bd1f23ed9aacbc1e90831e0d
fac348bb38a517fc2ad856fa161edb36fa727637
[]
[ "What is going on?", "Git or discord?", "Opening a link..", "`<Link to=\"route\" target=\"_blank\"` Something like this will help you to redirect ", "instead of a div in the styled components", "@Mgrdich , this approach doesn't work with external links.", "fixed", "Just try `to={{ pathname: \"https://example.com\" }}`", "@workshur, \r\nthis code add new path to the old\r\n`<Link to={{ pathname: 'https://github.com/provectus/kafka-ui' }}>\r\n <GitIcon />\r\n </Link>`\r\n<img width=\"730\" alt=\"image\" src=\"https://user-images.githubusercontent.com/22497195/172615461-598ae5af-216b-4323-a336-29e424d16434.png\">\r\n" ]
"2022-06-07T12:02:33Z"
[ "type/enhancement", "scope/frontend", "status/accepted" ]
Add social icons in UI
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx", "kafka-ui-react-app/src/theme/theme.ts" ]
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx", "kafka-ui-react-app/src/components/common/Icons/DiscordIcon.tsx", "kafka-ui-react-app/src/components/common/Icons/GitIcon.tsx", "kafka-ui-react-app/src/theme/theme.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/App.styled.ts b/kafka-ui-react-app/src/components/App.styled.ts index 94400cea879..d854f1f2412 100644 --- a/kafka-ui-react-app/src/components/App.styled.ts +++ b/kafka-ui-react-app/src/components/App.styled.ts @@ -2,6 +2,8 @@ import styled, { css } from 'styled-components'; import { Link } from 'react-router-dom'; import { Button } from './common/Button/Button'; +import GitIcon from './common/Icons/GitIcon'; +import DiscordIcon from './common/Icons/DiscordIcon'; export const Layout = styled.div` min-width: 1200px; @@ -75,7 +77,7 @@ export const Overlay = styled.div<{ $visible: boolean }>( ({ theme, $visible }) => css` height: calc(100vh - ${theme.layout.navBarHeight}); z-index: 99; - visibility: 'hidden'; + visibility: hidden; opacity: 0; -webkit-transition: all 0.5s ease; transition: all 0.5s ease; @@ -87,7 +89,7 @@ export const Overlay = styled.div<{ $visible: boolean }>( @media screen and (max-width: 1023px) { bottom: 0; right: 0; - visibility: 'visible'; + visibility: visible; opacity: 0.7; background-color: ${theme.layout.overlay.backgroundColor}; } @@ -97,6 +99,9 @@ export const Overlay = styled.div<{ $visible: boolean }>( export const Navbar = styled.nav( ({ theme }) => css` + display: flex; + align-items: center; + justify-content: space-between; border-bottom: 1px solid ${theme.layout.stuffBorderColor}; position: fixed; top: 0; @@ -110,13 +115,45 @@ export const Navbar = styled.nav( export const NavbarBrand = styled.div` display: flex; - justify-content: space-between; + justify-content: flex-end; align-items: center !important; flex-shrink: 0; - align-items: stretch; min-height: 3.25rem; `; +export const SocialLink = styled.a( + ({ theme: { layout, icons } }) => css` + display: block; + margin-top: 5px; + cursor: pointer; + fill: ${layout.socialLink.color}; + + &:hover { + ${DiscordIcon} { + fill: ${icons.discord.hover}; + } + ${GitIcon} { + fill: ${icons.git.hover}; + } + } + &:active { + ${DiscordIcon} { + fill: ${icons.discord.active}; + } + ${GitIcon} { + fill: ${icons.git.active}; + } + } + ` +); + +export const NavbarSocial = styled.div` + display: flex; + align-items: center; + gap: 10px; + margin: 10px; +`; + export const NavbarItem = styled.div` display: flex; position: relative; @@ -220,6 +257,6 @@ export const LogoutButton = styled(Button)( export const LogoutLink = styled(Link)( () => css` - margin-right: 16px; + margin-right: 2px; ` ); diff --git a/kafka-ui-react-app/src/components/App.tsx b/kafka-ui-react-app/src/components/App.tsx index 91b74145e44..c26f5514838 100644 --- a/kafka-ui-react-app/src/components/App.tsx +++ b/kafka-ui-react-app/src/components/App.tsx @@ -19,6 +19,8 @@ import { import * as S from './App.styled'; import Logo from './common/Logo/Logo'; +import GitIcon from './common/Icons/GitIcon'; +import DiscordIcon from './common/Icons/DiscordIcon'; const App: React.FC = () => { const dispatch = useAppDispatch(); @@ -64,12 +66,26 @@ const App: React.FC = () => { {GIT_TAG && <Version tag={GIT_TAG} commit={GIT_COMMIT} />} </S.NavbarItem> </S.NavbarBrand> + </S.NavbarBrand> + <S.NavbarSocial> <S.LogoutLink to="/logout"> <S.LogoutButton buttonType="primary" buttonSize="M"> Log out </S.LogoutButton> </S.LogoutLink> - </S.NavbarBrand> + <S.SocialLink + href="https://github.com/provectus/kafka-ui" + target="_blank" + > + <GitIcon /> + </S.SocialLink> + <S.SocialLink + href="https://discord.com/invite/4DWzD7pGE5" + target="_blank" + > + <DiscordIcon /> + </S.SocialLink> + </S.NavbarSocial> </S.Navbar> <S.Container> 
diff --git a/kafka-ui-react-app/src/components/common/Icons/DiscordIcon.tsx b/kafka-ui-react-app/src/components/common/Icons/DiscordIcon.tsx new file mode 100644 index 00000000000..0df8701b93e --- /dev/null +++ b/kafka-ui-react-app/src/components/common/Icons/DiscordIcon.tsx @@ -0,0 +1,20 @@ +import React from 'react'; +import styled from 'styled-components'; + +const DiscordIcon: React.FC<{ className?: string }> = ({ className }) => ( + <svg + width="22" + height="18" + className={className} + viewBox="0 0 22 18" + xmlns="http://www.w3.org/2000/svg" + > + <path + d="M18.6239 1.60293C17.2217 0.92338 15.7181 0.422718 14.1459 0.135969C14.1173 0.130434 14.0887 0.144265 14.0739 0.171926C13.8805 0.535202 13.6663 1.00913 13.5163 1.38163C11.8254 1.11425 10.1431 1.11425 8.48679 1.38163C8.33676 1.00085 8.11478 0.535202 7.92053 0.171926C7.90578 0.145187 7.87718 0.131357 7.84855 0.135969C6.27725 0.421802 4.7736 0.922464 3.37052 1.60293C3.35838 1.60846 3.34797 1.61769 3.34106 1.62967C0.488942 6.13013 -0.292371 10.52 0.0909151 14.8554C0.0926494 14.8766 0.103922 14.8969 0.119532 14.9098C2.00127 16.3693 3.82406 17.2554 5.61301 17.8428C5.64164 17.852 5.67197 17.8409 5.69019 17.816C6.11337 17.2057 6.49059 16.5621 6.81402 15.8853C6.83311 15.8456 6.81489 15.7986 6.77588 15.7829C6.17754 15.5432 5.6078 15.2509 5.05975 14.919C5.0164 14.8923 5.01293 14.8268 5.05281 14.7954C5.16814 14.7041 5.2835 14.6092 5.39363 14.5133C5.41355 14.4958 5.44131 14.4921 5.46474 14.5031C9.06518 16.2394 12.9631 16.2394 16.521 14.5031C16.5445 14.4912 16.5722 14.4949 16.593 14.5124C16.7032 14.6083 16.8185 14.7041 16.9347 14.7954C16.9746 14.8268 16.972 14.8923 16.9286 14.919C16.3806 15.2574 15.8108 15.5432 15.2116 15.782C15.1726 15.7977 15.1553 15.8456 15.1744 15.8853C15.5047 16.5611 15.882 17.2047 16.2973 17.8151C16.3147 17.8409 16.3459 17.852 16.3745 17.8428C18.1721 17.2554 19.9949 16.3693 21.8766 14.9098C21.8931 14.8969 21.9035 14.8775 21.9053 14.8563C22.364 9.84408 21.1369 5.49024 18.6525 1.63058C18.6465 1.61769 18.6361 1.60846 18.6239 1.60293ZM7.35169 12.2156C6.26771 12.2156 5.37454 11.1645 5.37454 9.8736C5.37454 8.58274 6.25039 7.53164 7.35169 7.53164C8.46163 7.53164 9.34616 8.59197 9.32881 9.8736C9.32881 11.1645 8.45296 12.2156 7.35169 12.2156ZM14.6619 12.2156C13.5779 12.2156 12.6847 11.1645 12.6847 9.8736C12.6847 8.58274 13.5606 7.53164 14.6619 7.53164C15.7718 7.53164 16.6563 8.59197 16.639 9.8736C16.639 11.1645 15.7718 12.2156 14.6619 12.2156Z" + fillRule="evenodd" + clipRule="evenodd" + /> + </svg> +); + +export default styled(DiscordIcon)``; diff --git a/kafka-ui-react-app/src/components/common/Icons/GitIcon.tsx b/kafka-ui-react-app/src/components/common/Icons/GitIcon.tsx new file mode 100644 index 00000000000..daecb611ff2 --- /dev/null +++ b/kafka-ui-react-app/src/components/common/Icons/GitIcon.tsx @@ -0,0 +1,21 @@ +import React from 'react'; +import styled from 'styled-components'; + +const GitIcon: React.FC<{ className?: string }> = ({ className }) => ( + <svg + width="20" + height="20" + className={className} + viewBox="0 0 1024 1024" + xmlns="http://www.w3.org/2000/svg" + > + <path + fillRule="evenodd" + clipRule="evenodd" + d="M8 0C3.58 0 0 3.58 0 8C0 11.54 2.29 14.53 5.47 15.59C5.87 15.66 6.02 15.42 6.02 15.21C6.02 15.02 6.01 14.39 6.01 13.72C4 14.09 3.48 13.23 3.32 12.78C3.23 12.55 2.84 11.84 2.5 11.65C2.22 11.5 1.82 11.13 2.49 11.12C3.12 11.11 3.57 11.7 3.72 11.94C4.44 13.15 5.59 12.81 6.05 12.6C6.12 12.08 6.33 11.73 6.56 11.53C4.78 11.33 2.92 10.64 2.92 7.58C2.92 6.71 3.23 5.99 3.74 5.43C3.66 5.23 3.38 4.41 3.82 
3.31C3.82 3.31 4.49 3.1 6.02 4.13C6.66 3.95 7.34 3.86 8.02 3.86C8.7 3.86 9.38 3.95 10.02 4.13C11.55 3.09 12.22 3.31 12.22 3.31C12.66 4.41 12.38 5.23 12.3 5.43C12.81 5.99 13.12 6.7 13.12 7.58C13.12 10.65 11.25 11.33 9.47 11.53C9.76 11.78 10.01 12.26 10.01 13.01C10.01 14.08 10 14.94 10 15.21C10 15.42 10.15 15.67 10.55 15.59C13.71 14.53 16 11.53 16 8C16 3.58 12.42 0 8 0Z" + transform="scale(64)" + /> + </svg> +); + +export default styled(GitIcon)``; diff --git a/kafka-ui-react-app/src/theme/theme.ts b/kafka-ui-react-app/src/theme/theme.ts index c9fd10566a6..478f0ede49f 100644 --- a/kafka-ui-react-app/src/theme/theme.ts +++ b/kafka-ui-react-app/src/theme/theme.ts @@ -33,6 +33,7 @@ export const Colors = { brand: { '5': '#E8E8FC', '10': '#D1D1FA', + '15': '#B8BEF9', '20': '#A3A3F5', '50': '#4C4CFF', '60': '#1717CF', @@ -54,6 +55,8 @@ export const Colors = { '20': '#bbdefb', '30': '#90caf9', '40': '#64b5f6', + '45': '#5865F2', + '50': '#5B67E3', }, }; @@ -67,6 +70,9 @@ const theme = { overlay: { backgroundColor: Colors.neutral[50], }, + socialLink: { + color: Colors.neutral[20], + }, }, panelColor: Colors.neutral[0], breadcrumb: Colors.neutral[30], @@ -493,6 +499,14 @@ const theme = { closeModalIcon: Colors.neutral[25], savedIcon: Colors.brand[50], dropdownArrowIcon: Colors.neutral[30], + git: { + hover: Colors.neutral[70], + active: Colors.neutral[90], + }, + discord: { + hover: Colors.brand[15], + active: Colors.blue[45], + }, }, viewer: { wrapper: Colors.neutral[3],
null
train
train
2022-06-17T14:20:41
"2022-06-03T12:54:11Z"
Haarolean
train
provectus/kafka-ui/2112_2138
provectus/kafka-ui
provectus/kafka-ui/2112
provectus/kafka-ui/2138
[ "connected", "timestamp(timedelta=1.0, similarity=0.8512422007514653)" ]
16ac428610e448486e1f3402cd45def2aa462499
3b69b67c60bc2168b1793496f0360ab1b3e6cabf
[]
[ "What are we want to achieve by using .toString() here", "isWindowed is boolean and values are not being displayed so I converted all displayed values to string" ]
"2022-06-07T12:14:18Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
KSQL response: isWindowed is empty
1. run `show tables;` 2. <img width="911" alt="image" src="https://user-images.githubusercontent.com/1494347/171864091-37ef3ed8-549c-429c-a8f5-f041d74d3059.png">
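For context on why the cell came out empty: React renders nothing for boolean children, so a raw `false` never appears in the table. The gold patch below stringifies the value via `data[accessor]?.toString()`; a minimal sketch of the behavior (the `Row` type here is illustrative, not the project's actual type):

```tsx
import React from 'react';

type Row = { isWindowed?: boolean };

// <td>{row.isWindowed}</td> would render an empty cell, because React
// skips boolean children. Stringifying first makes "false" visible.
const IsWindowedCell: React.FC<{ row: Row }> = ({ row }) => (
  <td>{row.isWindowed?.toString()}</td>
);

export default IsWindowedCell;
```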
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx" ]
[ "kafka-ui-react-app/src/components/KsqlDb/List/List.tsx", "kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx index 0e5ccc9facb..543aaa43f55 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/List.tsx @@ -25,6 +25,7 @@ const headers: HeadersType[] = [ { Header: 'Topic', accessor: 'topic' }, { Header: 'Key Format', accessor: 'keyFormat' }, { Header: 'Value Format', accessor: 'valueFormat' }, + { Header: 'Is Windowed', accessor: 'isWindowed' }, ]; const accessors = headers.map((header) => header.accessor); diff --git a/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx b/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx index 59ec84095e7..e9d073bb5d5 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/List/ListItem.tsx @@ -11,7 +11,7 @@ const ListItem: React.FC<Props> = ({ accessors, data }) => { return ( <tr> {accessors.map((accessor) => ( - <td key={accessor}>{data[accessor]}</td> + <td key={accessor}>{data[accessor]?.toString()}</td> ))} </tr> ); diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx b/kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx index e34b3d25db5..aaf4755b0a2 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/renderer/TableRenderer/TableRenderer.tsx @@ -73,7 +73,7 @@ const TableRenderer: React.FC<Props> = ({ table }) => { rows.map((row) => ( <tr key={row.id}> {row.cells.map((cell) => ( - <td key={cell.id}>{cell.value}</td> + <td key={cell.id}>{cell.value.toString()}</td> ))} </tr> ))
null
train
train
2022-06-09T09:01:53
"2022-06-03T13:46:12Z"
Haarolean
train
provectus/kafka-ui/2016_2139
provectus/kafka-ui
provectus/kafka-ui/2016
provectus/kafka-ui/2139
[ "connected" ]
a4046d46ef44d51626c7f655ef89c3f3429b7b80
3e5093d101a0850a620d417027449c7410889da2
[ "Hello there enshinov! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hey, thanks for reaching out. We'll make it configurable" ]
[ "I'd prefer fully qualified import for lombok's annotation over spring's", "I think we should refactor this code in general: lets\r\n1. make KafkaConnectClients normal object (spring component), CACHE field will become non static. (I would personally rename class to KafkaConnectClientsFactory)\r\n2. inject maxBuffSize value to it, not to KafkaConnectService" ]
"2022-06-07T12:32:39Z"
[ "type/bug", "good first issue", "scope/backend", "status/accepted", "status/confirmed" ]
Failures when Kafka Connectors have a high number of tasks (limit on max bytes to buffer : 262144)
**Describe the bug** Stack trace: at org.springframework.web.reactive.function.client.WebClientResponseException.create(WebClientResponseException.java:229) at org.springframework.web.reactive.function.client.DefaultClientResponse.lambda$createException$2(DefaultClientResponse.java:213) at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:106) at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79) at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398) at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2194) at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onSubscribe(FluxOnErrorResume.java:74) at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55) at reactor.core.publisher.Mono.subscribe(Mono.java:4399) at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:103) at reactor.core.publisher.Operators$MonoSubscriber.onError(Operators.java:1863) at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onError(FluxMapFuseable.java:140) at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onError(FluxContextWrite.java:121) at reactor.core.publisher.FluxMapFuseable$MapFuseableConditionalSubscriber.onError(FluxMapFuseable.java:334) at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onError(FluxFilterFuseable.java:382) at reactor.core.publisher.MonoCollect$CollectSubscriber.onError(MonoCollect.java:144) at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:132) at reactor.core.publisher.FluxPeek$PeekSubscriber.onError(FluxPeek.java:222) at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:132) at reactor.core.publisher.Operators.error(Operators.java:198) at reactor.netty.channel.FluxReceive.startReceiver(FluxReceive.java:182) at reactor.netty.channel.FluxReceive.subscribe(FluxReceive.java:143) at reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62) at reactor.netty.ByteBufFlux.subscribe(ByteBufFlux.java:339) at reactor.core.publisher.Mono.subscribe(Mono.java:4399) at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:103) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onError(MonoFlatMap.java:172) at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onError(FluxContextWrite.java:121) at reactor.core.publisher.FluxMapFuseable$MapFuseableConditionalSubscriber.onError(FluxMapFuseable.java:334) at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onError(FluxFilterFuseable.java:382) at reactor.core.publisher.MonoCollect$CollectSubscriber.onError(MonoCollect.java:144) at reactor.core.publisher.MonoCollect$CollectSubscriber.onNext(MonoCollect.java:123) at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:120) at reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200) at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:120) at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:364) at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404) at reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:724) at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) at 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436) at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324) at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:311) at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:432) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276) at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795) at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480) at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378) at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) at java.base/java.lang.Thread.run(Thread.java:830) Caused by: org.springframework.core.io.buffer.DataBufferLimitException: Exceeded limit on max bytes to buffer : 262144 at org.springframework.core.io.buffer.LimitedDataBufferList.raiseLimitException(LimitedDataBufferList.java:99) **Set up** Docker **Steps to Reproduce** Steps to reproduce the behavior: 1. Create ~50-80 elasticsearch tasks (or others) 2. Get some mapping error from elastic **Expected behavior** Should work **Additional context** Will be good to use variable to change **max-in-memory-size**.
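The gold patch below makes the limit configurable through a `webclient.max-in-memory-buffer-size` property (defaulting to 20MB per its `@Value` annotations). A sketch of how it could be raised, assuming standard Spring Boot property binding; the 30MB figure is only an example value:

```yaml
# application.yml — raises the in-memory buffer limit used by the
# Kafka Connect and KSQL web clients (default: 20MB)
webclient:
  max-in-memory-buffer-size: 30MB
```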
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClients.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClientsFactory.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java" ]
[ "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java", "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java" ]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClients.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClients.java deleted file mode 100644 index de0c9054ae2..00000000000 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClients.java +++ /dev/null @@ -1,19 +0,0 @@ -package com.provectus.kafka.ui.client; - -import com.provectus.kafka.ui.connect.api.KafkaConnectClientApi; -import com.provectus.kafka.ui.model.KafkaConnectCluster; -import java.util.Map; -import java.util.concurrent.ConcurrentHashMap; - -public final class KafkaConnectClients { - - private KafkaConnectClients() { - - } - - private static final Map<String, KafkaConnectClientApi> CACHE = new ConcurrentHashMap<>(); - - public static KafkaConnectClientApi withKafkaConnectConfig(KafkaConnectCluster config) { - return CACHE.computeIfAbsent(config.getAddress(), s -> new RetryingKafkaConnectClient(config)); - } -} diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClientsFactory.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClientsFactory.java new file mode 100644 index 00000000000..3a5195f5cf0 --- /dev/null +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KafkaConnectClientsFactory.java @@ -0,0 +1,22 @@ +package com.provectus.kafka.ui.client; + +import com.provectus.kafka.ui.connect.api.KafkaConnectClientApi; +import com.provectus.kafka.ui.model.KafkaConnectCluster; +import java.util.Map; +import java.util.concurrent.ConcurrentHashMap; +import org.springframework.beans.factory.annotation.Value; +import org.springframework.stereotype.Service; +import org.springframework.util.unit.DataSize; + +@Service +public class KafkaConnectClientsFactory { + + @Value("${webclient.max-in-memory-buffer-size:20MB}") + private DataSize maxBuffSize; + + private final Map<String, KafkaConnectClientApi> cache = new ConcurrentHashMap<>(); + + public KafkaConnectClientApi withKafkaConnectConfig(KafkaConnectCluster config) { + return cache.computeIfAbsent(config.getAddress(), s -> new RetryingKafkaConnectClient(config, maxBuffSize)); + } +} diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java index 70716613730..b1115d0eef1 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/RetryingKafkaConnectClient.java @@ -1,22 +1,34 @@ package com.provectus.kafka.ui.client; +import com.fasterxml.jackson.databind.DeserializationFeature; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule; import com.provectus.kafka.ui.connect.ApiClient; +import com.provectus.kafka.ui.connect.RFC3339DateFormat; import com.provectus.kafka.ui.connect.api.KafkaConnectClientApi; import com.provectus.kafka.ui.connect.model.Connector; import com.provectus.kafka.ui.connect.model.NewConnector; import com.provectus.kafka.ui.exception.KafkaConnectConflictReponseException; import com.provectus.kafka.ui.exception.ValidationException; import com.provectus.kafka.ui.model.KafkaConnectCluster; +import java.text.DateFormat; import java.time.Duration; import java.util.List; import java.util.Map; +import java.util.TimeZone; import lombok.extern.slf4j.Slf4j; +import org.openapitools.jackson.nullable.JsonNullableModule; import 
org.springframework.core.ParameterizedTypeReference; import org.springframework.http.HttpHeaders; import org.springframework.http.HttpMethod; import org.springframework.http.MediaType; +import org.springframework.http.codec.json.Jackson2JsonDecoder; +import org.springframework.http.codec.json.Jackson2JsonEncoder; import org.springframework.util.MultiValueMap; +import org.springframework.util.unit.DataSize; import org.springframework.web.client.RestClientException; +import org.springframework.web.reactive.function.client.ExchangeStrategies; +import org.springframework.web.reactive.function.client.WebClient; import org.springframework.web.reactive.function.client.WebClientResponseException; import reactor.core.publisher.Flux; import reactor.core.publisher.Mono; @@ -27,8 +39,8 @@ public class RetryingKafkaConnectClient extends KafkaConnectClientApi { private static final int MAX_RETRIES = 5; private static final Duration RETRIES_DELAY = Duration.ofMillis(200); - public RetryingKafkaConnectClient(KafkaConnectCluster config) { - super(new RetryingApiClient(config)); + public RetryingKafkaConnectClient(KafkaConnectCluster config, DataSize maxBuffSize) { + super(new RetryingApiClient(config, maxBuffSize)); } private static Retry conflictCodeRetry() { @@ -73,13 +85,48 @@ public Mono<Connector> setConnectorConfig(String connectorName, Map<String, Obje private static class RetryingApiClient extends ApiClient { - public RetryingApiClient(KafkaConnectCluster config) { - super(); + private static final DateFormat dateFormat = getDefaultDateFormat(); + private static final ObjectMapper mapper = buildObjectMapper(dateFormat); + + public RetryingApiClient(KafkaConnectCluster config, DataSize maxBuffSize) { + super(buildWebClient(mapper, maxBuffSize), mapper, dateFormat); setBasePath(config.getAddress()); setUsername(config.getUserName()); setPassword(config.getPassword()); } + public static DateFormat getDefaultDateFormat() { + DateFormat dateFormat = new RFC3339DateFormat(); + dateFormat.setTimeZone(TimeZone.getTimeZone("UTC")); + return dateFormat; + } + + public static WebClient buildWebClient(ObjectMapper mapper, DataSize maxBuffSize) { + ExchangeStrategies strategies = ExchangeStrategies + .builder() + .codecs(clientDefaultCodecsConfigurer -> { + clientDefaultCodecsConfigurer.defaultCodecs() + .jackson2JsonEncoder(new Jackson2JsonEncoder(mapper, MediaType.APPLICATION_JSON)); + clientDefaultCodecsConfigurer.defaultCodecs() + .jackson2JsonDecoder(new Jackson2JsonDecoder(mapper, MediaType.APPLICATION_JSON)); + clientDefaultCodecsConfigurer.defaultCodecs() + .maxInMemorySize((int) maxBuffSize.toBytes()); + }) + .build(); + WebClient.Builder webClient = WebClient.builder().exchangeStrategies(strategies); + return webClient.build(); + } + + public static ObjectMapper buildObjectMapper(DateFormat dateFormat) { + ObjectMapper mapper = new ObjectMapper(); + mapper.setDateFormat(dateFormat); + mapper.registerModule(new JavaTimeModule()); + mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false); + JsonNullableModule jnm = new JsonNullableModule(); + mapper.registerModule(jnm); + return mapper; + } + @Override public <T> Mono<T> invokeAPI(String path, HttpMethod method, Map<String, Object> pathParams, MultiValueMap<String, String> queryParams, Object body, diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java index 21680d3dd91..69fd7a26276 100644 --- 
a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java @@ -2,7 +2,7 @@ import com.fasterxml.jackson.core.type.TypeReference; import com.fasterxml.jackson.databind.ObjectMapper; -import com.provectus.kafka.ui.client.KafkaConnectClients; +import com.provectus.kafka.ui.client.KafkaConnectClientsFactory; import com.provectus.kafka.ui.connect.api.KafkaConnectClientApi; import com.provectus.kafka.ui.connect.model.ConnectorStatus; import com.provectus.kafka.ui.connect.model.ConnectorStatusConnector; @@ -21,7 +21,6 @@ import com.provectus.kafka.ui.model.ConnectorTaskStatusDTO; import com.provectus.kafka.ui.model.FullConnectorInfoDTO; import com.provectus.kafka.ui.model.KafkaCluster; -import com.provectus.kafka.ui.model.KafkaConnectCluster; import com.provectus.kafka.ui.model.NewConnectorDTO; import com.provectus.kafka.ui.model.TaskDTO; import com.provectus.kafka.ui.model.connect.InternalConnectInfo; @@ -51,6 +50,7 @@ public class KafkaConnectService { private final KafkaConnectMapper kafkaConnectMapper; private final ObjectMapper objectMapper; private final KafkaConfigSanitizer kafkaConfigSanitizer; + private final KafkaConnectClientsFactory kafkaConnectClientsFactory; public Mono<Flux<ConnectDTO>> getConnects(KafkaCluster cluster) { return Mono.just( @@ -328,6 +328,6 @@ private Mono<KafkaConnectClientApi> withConnectClient(KafkaCluster cluster, Stri .filter(connect -> connect.getName().equals(connectName)) .findFirst()) .switchIfEmpty(Mono.error(ConnectNotFoundException::new)) - .map(KafkaConnectClients::withKafkaConnectConfig); + .map(kafkaConnectClientsFactory::withKafkaConnectConfig); } } diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java index 9341dd30d3b..161c284c972 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java @@ -21,6 +21,7 @@ import org.springframework.http.MediaType; import org.springframework.http.codec.json.Jackson2JsonDecoder; import org.springframework.util.MimeTypeUtils; +import org.springframework.util.unit.DataSize; import org.springframework.web.reactive.function.client.ExchangeStrategies; import org.springframework.web.reactive.function.client.WebClient; import org.springframework.web.reactive.function.client.WebClientResponseException; @@ -57,9 +58,11 @@ private static class KsqlRequest { //-------------------------------------------------------------------------------------------- private final KafkaCluster cluster; + private final DataSize maxBuffSize; - public KsqlApiClient(KafkaCluster cluster) { + public KsqlApiClient(KafkaCluster cluster, DataSize maxBuffSize) { this.cluster = cluster; + this.maxBuffSize = maxBuffSize; } private WebClient webClient() { @@ -75,6 +78,7 @@ private WebClient webClient() { }) .build(); return WebClient.builder() + .codecs(c -> c.defaultCodecs().maxInMemorySize((int) maxBuffSize.toBytes())) .exchangeStrategies(exchangeStrategies) .build(); } diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java index 12a2e7d0b24..63fcede3611 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java +++ 
b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2.java @@ -13,16 +13,23 @@ import java.util.UUID; import java.util.concurrent.TimeUnit; import java.util.stream.Collectors; -import lombok.Value; import lombok.extern.slf4j.Slf4j; +import org.springframework.beans.factory.annotation.Value; import org.springframework.stereotype.Service; +import org.springframework.util.unit.DataSize; import reactor.core.publisher.Flux; @Slf4j @Service public class KsqlServiceV2 { - @Value + private final DataSize maxBuffSize; + + public KsqlServiceV2(@Value("${webclient.max-in-memory-buffer-size:20MB}") DataSize maxBuffSize) { + this.maxBuffSize = maxBuffSize; + } + + @lombok.Value private static class KsqlExecuteCommand { KafkaCluster cluster; String ksql; @@ -48,12 +55,12 @@ public Flux<KsqlResponseTable> execute(String commandId) { throw new ValidationException("No command registered with id " + commandId); } registeredCommands.invalidate(commandId); - return new KsqlApiClient(cmd.cluster) + return new KsqlApiClient(cmd.cluster, maxBuffSize) .execute(cmd.ksql, cmd.streamProperties); } public Flux<KsqlTableDescriptionDTO> listTables(KafkaCluster cluster) { - return new KsqlApiClient(cluster) + return new KsqlApiClient(cluster, maxBuffSize) .execute("LIST TABLES;", Map.of()) .flatMap(resp -> { if (!resp.getHeader().equals("Tables")) { @@ -75,7 +82,7 @@ public Flux<KsqlTableDescriptionDTO> listTables(KafkaCluster cluster) { } public Flux<KsqlStreamDescriptionDTO> listStreams(KafkaCluster cluster) { - return new KsqlApiClient(cluster) + return new KsqlApiClient(cluster, maxBuffSize) .execute("LIST STREAMS;", Map.of()) .flatMap(resp -> { if (!resp.getHeader().equals("Streams")) {
diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java index d09e32f948c..5956d730829 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java @@ -16,6 +16,7 @@ import org.junit.jupiter.api.AfterAll; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.Test; +import org.springframework.util.unit.DataSize; import org.testcontainers.shaded.org.awaitility.Awaitility; import org.testcontainers.utility.DockerImageName; import reactor.test.StepVerifier; @@ -26,6 +27,8 @@ class KsqlApiClientTest extends AbstractIntegrationTest { DockerImageName.parse("confluentinc/ksqldb-server").withTag("0.24.0")) .withKafka(kafka); + private static final DataSize maxBuffSize = DataSize.ofMegabytes(20); + @BeforeAll static void startContainer() { KSQL_DB.start(); @@ -39,7 +42,7 @@ static void stopContainer() { // Tutorial is here: https://ksqldb.io/quickstart.html @Test void ksqTutorialQueriesWork() { - var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build()); + var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(), maxBuffSize); execCommandSync(client, "CREATE STREAM riderLocations (profileId VARCHAR, latitude DOUBLE, longitude DOUBLE) " + "WITH (kafka_topic='locations', value_format='json', partitions=1);", @@ -126,4 +129,4 @@ private void execCommandSync(KsqlApiClient client, String... ksqls) { } -} \ No newline at end of file +} diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java index 22c87c9aecc..d670680ea21 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java @@ -13,6 +13,7 @@ import org.junit.jupiter.api.AfterAll; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.Test; +import org.springframework.util.unit.DataSize; import org.testcontainers.utility.DockerImageName; class KsqlServiceV2Test extends AbstractIntegrationTest { @@ -24,6 +25,8 @@ class KsqlServiceV2Test extends AbstractIntegrationTest { private static final Set<String> STREAMS_TO_DELETE = new CopyOnWriteArraySet<>(); private static final Set<String> TABLES_TO_DELETE = new CopyOnWriteArraySet<>(); + private static final DataSize maxBuffSize = DataSize.ofMegabytes(20); + @BeforeAll static void init() { KSQL_DB.start(); @@ -31,7 +34,7 @@ static void init() { @AfterAll static void cleanup() { - var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build()); + var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(), maxBuffSize); TABLES_TO_DELETE.forEach(t -> client.execute(String.format("DROP TABLE IF EXISTS %s DELETE TOPIC;", t), Map.of()) @@ -44,7 +47,7 @@ static void cleanup() { KSQL_DB.stop(); } - private final KsqlServiceV2 ksqlService = new KsqlServiceV2(); + private final KsqlServiceV2 ksqlService = new KsqlServiceV2(maxBuffSize); @Test void listStreamsReturnsAllKsqlStreams() { @@ -52,7 +55,7 @@ void listStreamsReturnsAllKsqlStreams() { var streamName = "stream_" + System.currentTimeMillis(); STREAMS_TO_DELETE.add(streamName); - new 
KsqlApiClient(cluster) + new KsqlApiClient(cluster, maxBuffSize) .execute( String.format("CREATE STREAM %s ( " + " c1 BIGINT KEY, " @@ -81,7 +84,7 @@ void listTablesReturnsAllKsqlTables() { var tableName = "table_" + System.currentTimeMillis(); TABLES_TO_DELETE.add(tableName); - new KsqlApiClient(cluster) + new KsqlApiClient(cluster, maxBuffSize) .execute( String.format("CREATE TABLE %s ( " + " c1 BIGINT PRIMARY KEY, " @@ -105,4 +108,4 @@ void listTablesReturnsAllKsqlTables() { ); } -} \ No newline at end of file +}
val
train
2022-06-28T14:15:12
"2022-05-23T11:01:30Z"
enshinov
train
provectus/kafka-ui/2141_2142
provectus/kafka-ui
provectus/kafka-ui/2141
provectus/kafka-ui/2142
[ "timestamp(timedelta=775.0, similarity=0.8614877680241719)", "connected" ]
45dc7f616c7fe192a5e7ca2899f3a6126e2ad06e
f294b1bbad4cc93f07a685f63cad9e71f02eb8a3
[]
[]
"2022-06-07T14:39:01Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed", "type/regression" ]
Brokers page is broken
**Describe the bug** A `ChunkLoadError: Loading chunk 317 failed` error appears when navigating to Brokers **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Brokers **Expected behavior** Broker info should display without any error **Screenshots** https://user-images.githubusercontent.com/104780608/172392038-4af577fc-e8c7-442c-aa41-2f286f53960a.mov
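The gold patch below drops the lazy-loaded chunk for the Brokers page in favor of an eager import. A `ChunkLoadError` like this is commonly seen when a new deployment replaces the hashed chunk files that an already-loaded page still tries to fetch; a minimal before/after sketch of the change (the "before" line is commented out so the snippet stays a valid module):

```tsx
// Before (lazy; a stale or missing chunk request throws ChunkLoadError):
// const Brokers = React.lazy(() => import('components/Brokers/Brokers'));

// After (eager; Brokers ships with the main bundle, no extra request):
import Brokers from 'components/Brokers/Brokers';
```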
[ "kafka-ui-react-app/src/components/Cluster/Cluster.tsx" ]
[ "kafka-ui-react-app/src/components/Cluster/Cluster.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Cluster/Cluster.tsx b/kafka-ui-react-app/src/components/Cluster/Cluster.tsx index e91b8e8317b..f4da3e984a2 100644 --- a/kafka-ui-react-app/src/components/Cluster/Cluster.tsx +++ b/kafka-ui-react-app/src/components/Cluster/Cluster.tsx @@ -18,6 +18,7 @@ import { clusterTopicsRelativePath, getNonExactPath, } from 'lib/paths'; +import Brokers from 'components/Brokers/Brokers'; import Topics from 'components/Topics/Topics'; import Schemas from 'components/Schemas/Schemas'; import Connect from 'components/Connect/Connect'; @@ -29,8 +30,6 @@ import { BreadcrumbRoute } from 'components/common/Breadcrumb/Breadcrumb.route'; import { BreadcrumbProvider } from 'components/common/Breadcrumb/Breadcrumb.provider'; import PageLoader from 'components/common/PageLoader/PageLoader'; -const Brokers = React.lazy(() => import('components/Brokers/Brokers')); - const Cluster: React.FC = () => { const { clusterName } = useAppParams<ClusterNameRoute>(); const isReadOnly = useSelector(getClustersReadonlyStatus(clusterName));
null
train
train
2022-06-07T14:05:58
"2022-06-07T13:29:13Z"
armenuikafka
train
provectus/kafka-ui/2143_2151
provectus/kafka-ui
provectus/kafka-ui/2143
provectus/kafka-ui/2151
[ "timestamp(timedelta=0.0, similarity=0.8694110719603608)", "connected" ]
fac348bb38a517fc2ad856fa161edb36fa727637
3c922bc4706e44f0a6aaa33210b76530e841f56c
[]
[ "It should be in a ..styled.ts file", "replaced", "Please check if we can use our smart table here", "I checked. Yes, we can use Smart Table, but for this we will have to rewrite the entire component. Plus, it will be necessary to add some more code - for example, to describe the state for the table. Should I do it?" ]
"2022-06-10T13:05:06Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
Make broker table row clickable
It's not convenient to press on the number, let's make the whole row clickable. <img width="1078" alt="image" src="https://user-images.githubusercontent.com/1494347/172599358-b2a52a90-8e67-40e0-ae95-171dbf94845f.png">
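The gold patch below wraps the row in a styled `tr` with a pointer cursor and navigates on click. A minimal self-contained sketch of the pattern (the single-cell row body is illustrative):

```tsx
import React from 'react';
import { useNavigate } from 'react-router-dom';
import styled from 'styled-components';

const ClickableRow = styled.tr`
  cursor: pointer;
`;

// The whole row is a click target, not just the broker id link.
const BrokerRow: React.FC<{ brokerId: number }> = ({ brokerId }) => {
  const navigate = useNavigate();
  return (
    <ClickableRow onClick={() => navigate(`${brokerId}`)}>
      <td>{brokerId}</td>
    </ClickableRow>
  );
};

export default BrokerRow;
```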
[ "kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx", "kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.style.ts", "kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx", "kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.style.ts b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.style.ts new file mode 100644 index 00000000000..f5f83733331 --- /dev/null +++ b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.style.ts @@ -0,0 +1,5 @@ +import styled from 'styled-components'; + +export const ClickableRow = styled.tr` + cursor: pointer; +`; diff --git a/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx index 36e36368c3a..6423891f421 100644 --- a/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx +++ b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx @@ -1,7 +1,7 @@ import React from 'react'; import { ClusterName } from 'redux/interfaces'; import BytesFormatted from 'components/common/BytesFormatted/BytesFormatted'; -import { NavLink } from 'react-router-dom'; +import { NavLink, useNavigate } from 'react-router-dom'; import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell'; import { Table } from 'components/common/table/Table/Table.styled'; import PageHeading from 'components/common/PageHeading/PageHeading'; @@ -10,7 +10,10 @@ import useAppParams from 'lib/hooks/useAppParams'; import useBrokers from 'lib/hooks/useBrokers'; import useClusterStats from 'lib/hooks/useClusterStats'; +import { ClickableRow } from './BrokersList.style'; + const BrokersList: React.FC = () => { + const navigate = useNavigate(); const { clusterName } = useAppParams<{ clusterName: ClusterName }>(); const { data: clusterStats } = useClusterStats(clusterName); const { data: brokers } = useBrokers(clusterName); @@ -58,7 +61,6 @@ const BrokersList: React.FC = () => { onlinePartitionCount )} <Metrics.LightText> - {' '} of {(onlinePartitionCount || 0) + (offlinePartitionCount || 0)} </Metrics.LightText> </Metrics.Indicator> @@ -114,8 +116,12 @@ const BrokersList: React.FC = () => { diskUsage.length !== 0 && diskUsage.map(({ brokerId, segmentSize, segmentCount }) => { const brokerItem = brokers?.find(({ id }) => id === brokerId); + return ( - <tr key={brokerId}> + <ClickableRow + key={brokerId} + onClick={() => navigate(`${brokerId}`)} + > <td> <NavLink to={`${brokerId}`} role="link"> {brokerId} @@ -127,7 +133,7 @@ const BrokersList: React.FC = () => { <td>{segmentCount}</td> <td>{brokerItem?.port}</td> <td>{brokerItem?.host}</td> - </tr> + </ClickableRow> ); })} </tbody> diff --git a/kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx b/kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx index 437cc44cd4c..94ffa0b77af 100644 --- a/kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx +++ b/kafka-ui-react-app/src/components/Brokers/BrokersList/__test__/BrokersList.spec.tsx @@ -9,6 +9,14 @@ import { brokersPayload, clusterStatsPayload, } from 'components/Brokers/__test__/fixtures'; +import userEvent from '@testing-library/user-event'; + +const mockedUsedNavigate = jest.fn(); + +jest.mock('react-router-dom', () => ({ + ...jest.requireActual('react-router-dom'), + useNavigate: () => mockedUsedNavigate, +})); describe('BrokersList Component', () => { afterEach(() => fetchMock.reset()); @@ -53,6 +61,22 @@ describe('BrokersList Component', () => { expect(rows.length).toEqual(3); }); + it('opens broker when row clicked', async () => { + const fetchStatsMock = fetchMock.get(fetchStatsUrl, 
clusterStatsPayload); + await act(() => { + renderComponent(); + }); + await waitFor(() => expect(fetchStatsMock.called()).toBeTruthy()); + await act(() => { + userEvent.click(screen.getByRole('cell', { name: '0' })); + }); + + await waitFor(() => { + expect(mockedUsedNavigate).toBeCalled(); + expect(mockedUsedNavigate).toBeCalledWith('0'); + }); + }); + it('shows warning when offlinePartitionCount > 0', async () => { const fetchStatsMock = fetchMock.getOnce(fetchStatsUrl, { ...clusterStatsPayload, @@ -93,6 +117,7 @@ describe('BrokersList Component', () => { expect(onlineWidgetDef).toBeInTheDocument(); expect(onlineWidget).toBeInTheDocument(); }); + it('shows right count when inSyncReplicasCount: undefined outOfSyncReplicasCount: 1', async () => { const fetchStatsMock = fetchMock.getOnce(fetchStatsUrl, { ...clusterStatsPayload,
null
train
train
2022-06-21T14:23:28
"2022-06-08T10:53:54Z"
Haarolean
train
provectus/kafka-ui/2140_2161
provectus/kafka-ui
provectus/kafka-ui/2140
provectus/kafka-ui/2161
[ "timestamp(timedelta=1.0, similarity=1.0)", "connected" ]
4ebc74212cea7741dd68362881430517322fd018
d859dd6b3f4bcdcc9143a2e02695eca95be0fd9f
[ "\"partition\" selector value is not sent unless specified explicitly by user\r\nthe same behavior for #1725" ]
[]
"2022-06-14T10:24:06Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Not possible to add message for a topic
**Describe the bug** An error appears when creating a message for a Topic **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to the Topic profile 2. Turn to the Messages tab 3. Produce a message 4. Add Key/Content/Headers 5. Press Send **Expected behavior** The message should be added without any error in the console or error message in the UI. **Screenshots** https://user-images.githubusercontent.com/104780608/172389686-f2be6421-b083-4984-9e99-eb7dee910643.mov
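The gold patch below substitutes partition 0 when the selector was left untouched. A small sketch of the fallback, with a hedged note that nullish coalescing behaves identically here (a user-selected partition 0 is falsy, but it still maps back to 0):

```ts
// Illustrative value as it comes from the form select (undefined when untouched).
let partition: number | undefined;

// Per the gold patch: an unset partition falls back to partition 0.
const partitionToSend = !partition ? 0 : partition;

// Equivalent in this case; `??` only replaces null/undefined, while
// `!partition` also catches 0 — which resolves to 0 anyway.
const partitionToSendAlt = partition ?? 0;
```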
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx index c8898c7ad27..11fa6c08e3e 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/SendMessage/SendMessage.tsx @@ -142,7 +142,7 @@ const SendMessage: React.FC = () => { key: !key ? null : key, content: !content ? null : content, headers, - partition, + partition: !partition ? 0 : partition, }, }); dispatch(fetchTopicDetails({ clusterName, topicName }));
null
test
train
2022-06-14T22:15:03
"2022-06-07T13:19:26Z"
armenuikafka
train
provectus/kafka-ui/1175_2170
provectus/kafka-ui
provectus/kafka-ui/1175
provectus/kafka-ui/2170
[ "timestamp(timedelta=1.0, similarity=0.846415028433612)", "connected" ]
aa839b4d69106e50500d4a80ac4e65a26c9c8bc7
9c7f078dedf40e1d87a21a4434aeff9466f12f53
[]
[]
"2022-06-15T11:02:12Z"
[ "type/enhancement", "good first issue", "scope/backend", "scope/frontend", "status/accepted" ]
Implement logout button
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx" ]
[ "kafka-ui-react-app/src/components/App.styled.ts", "kafka-ui-react-app/src/components/App.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/App.styled.ts b/kafka-ui-react-app/src/components/App.styled.ts index d854f1f2412..cd79fe62497 100644 --- a/kafka-ui-react-app/src/components/App.styled.ts +++ b/kafka-ui-react-app/src/components/App.styled.ts @@ -255,7 +255,7 @@ export const LogoutButton = styled(Button)( ` ); -export const LogoutLink = styled(Link)( +export const LogoutLink = styled.a( () => css` margin-right: 2px; ` diff --git a/kafka-ui-react-app/src/components/App.tsx b/kafka-ui-react-app/src/components/App.tsx index c26f5514838..1035edb05d6 100644 --- a/kafka-ui-react-app/src/components/App.tsx +++ b/kafka-ui-react-app/src/components/App.tsx @@ -68,7 +68,7 @@ const App: React.FC = () => { </S.NavbarBrand> </S.NavbarBrand> <S.NavbarSocial> - <S.LogoutLink to="/logout"> + <S.LogoutLink href="/logout"> <S.LogoutButton buttonType="primary" buttonSize="M"> Log out </S.LogoutButton>
null
test
train
2022-06-22T09:48:03
"2021-12-06T18:58:53Z"
Haarolean
train
provectus/kafka-ui/2144_2171
provectus/kafka-ui
provectus/kafka-ui/2144
provectus/kafka-ui/2171
[ "keyword_pr_to_issue" ]
d859dd6b3f4bcdcc9143a2e02695eca95be0fd9f
41fd765d8305a3595f684b5bd0da229fafc7c8dd
[ "Hello, I am newbies and I am willing to help.\r\n\r\nI checked the code, the Label box is hardcoded in 50px. Is this you are referring as broken UI?\r\n\r\n[ width: 50px;](https://github.com/provectus/kafka-ui/blob/e4dc1134abe45ee72c10d454eb7e4e326c52e194/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts#L59)\r\n", "@ymlai87416 hi and welcome! Feel free to tackle this.\r\nYes, \"timestamp\" and \"content\" labels are being wrapped." ]
[]
"2022-06-15T12:58:16Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Topic message info layout is broken
<img width="416" alt="image" src="https://user-images.githubusercontent.com/1494347/172599553-28a0079f-ac6c-443d-82e1-e2d6721bfa7c.png">
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts index de0967b4ab8..2f3efd7b537 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/MessageContent/MessageContent.styled.ts @@ -56,7 +56,7 @@ export const Metadata = styled.span` export const MetadataLabel = styled.p` color: ${({ theme }) => theme.topicMetaData.color.label}; font-size: 14px; - width: 50px; + width: 80px; `; export const MetadataValue = styled.p`
null
train
train
2022-06-15T14:56:47
"2022-06-08T10:55:28Z"
Haarolean
train
provectus/kafka-ui/2079_2187
provectus/kafka-ui
provectus/kafka-ui/2079
provectus/kafka-ui/2187
[ "timestamp(timedelta=1.0, similarity=0.8984928089324661)", "connected" ]
46bcbb3436caf7357ff11eebbd1b49fe4f2cd167
a77869783b1aa27de5d5a71e5614079fc48c95d9
[]
[]
"2022-06-20T08:07:46Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed", "type/regression" ]
Redirection issue from Connector profile's path to Dashboard instead of Connectors page
**Describe the bug** Clicking 'Connectors' in a Connector's breadcrumb path redirects to the Dashboard instead of the Connectors page **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Kafka Connect 2. Open any Connector 3. Press 'Connectors' in the breadcrumb path **Expected behavior** The Connectors page should open **Screenshots** https://user-images.githubusercontent.com/104780608/171187811-10da45ee-ba17-468a-95b6-5aaf609be1f5.mov **Additional context** linked to https://github.com/provectus/kafka-ui/issues/1961
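The gold patch below swaps the hard-coded `to="/"` for the cluster-scoped Connectors path. A minimal sketch of the corrected route (the imports match the ones the patch adds; the wrapper function is only for illustration):

```tsx
import React from 'react';
import { Navigate, Route } from 'react-router-dom';
import {
  clusterConnectConnectorsRelativePath,
  clusterConnectorsPath,
} from 'lib/paths';

// Redirect to the cluster's Connectors page instead of "/", which
// previously landed users on the Dashboard.
const connectorsRedirect = (clusterName: string) => (
  <Route
    path={clusterConnectConnectorsRelativePath}
    element={<Navigate to={clusterConnectorsPath(clusterName)} replace />}
  />
);

export default connectorsRedirect;
```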
[ "kafka-ui-react-app/src/components/Connect/Connect.tsx" ]
[ "kafka-ui-react-app/src/components/Connect/Connect.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/Connect.tsx b/kafka-ui-react-app/src/components/Connect/Connect.tsx index f1bb7ef4066..c4c94227e6e 100644 --- a/kafka-ui-react-app/src/components/Connect/Connect.tsx +++ b/kafka-ui-react-app/src/components/Connect/Connect.tsx @@ -7,57 +7,63 @@ import { clusterConnectConnectorsRelativePath, clusterConnectorNewRelativePath, getNonExactPath, + clusterConnectorsPath, } from 'lib/paths'; import { BreadcrumbRoute } from 'components/common/Breadcrumb/Breadcrumb.route'; +import useAppParams from 'lib/hooks/useAppParams'; import ListContainer from './List/ListContainer'; import NewContainer from './New/NewContainer'; import DetailsContainer from './Details/DetailsContainer'; import EditContainer from './Edit/EditContainer'; -const Connect: React.FC = () => ( - <Routes> - <Route - index - element={ - <BreadcrumbRoute> - <ListContainer /> - </BreadcrumbRoute> - } - /> - <Route - path={clusterConnectorNewRelativePath} - element={ - <BreadcrumbRoute> - <NewContainer /> - </BreadcrumbRoute> - } - /> - <Route - path={clusterConnectConnectorEditRelativePath} - element={ - <BreadcrumbRoute> - <EditContainer /> - </BreadcrumbRoute> - } - /> - <Route - path={getNonExactPath(clusterConnectConnectorRelativePath)} - element={ - <BreadcrumbRoute> - <DetailsContainer /> - </BreadcrumbRoute> - } - /> - <Route - path={clusterConnectConnectorsRelativePath} - element={<Navigate to="/" replace />} - /> - <Route - path={RouteParams.connectName} - element={<Navigate to="/" replace />} - /> - </Routes> -); +const Connect: React.FC = () => { + const { clusterName } = useAppParams(); + + return ( + <Routes> + <Route + index + element={ + <BreadcrumbRoute> + <ListContainer /> + </BreadcrumbRoute> + } + /> + <Route + path={clusterConnectorNewRelativePath} + element={ + <BreadcrumbRoute> + <NewContainer /> + </BreadcrumbRoute> + } + /> + <Route + path={clusterConnectConnectorEditRelativePath} + element={ + <BreadcrumbRoute> + <EditContainer /> + </BreadcrumbRoute> + } + /> + <Route + path={getNonExactPath(clusterConnectConnectorRelativePath)} + element={ + <BreadcrumbRoute> + <DetailsContainer /> + </BreadcrumbRoute> + } + /> + <Route + path={clusterConnectConnectorsRelativePath} + element={<Navigate to={clusterConnectorsPath(clusterName)} replace />} + /> + <Route + path={RouteParams.connectName} + element={<Navigate to="/" replace />} + /> + </Routes> + ); +}; export default Connect;
null
train
train
2022-06-29T11:40:50
"2022-05-31T13:49:57Z"
armenuikafka
train
provectus/kafka-ui/1962_2191
provectus/kafka-ui
provectus/kafka-ui/1962
provectus/kafka-ui/2191
[ "connected" ]
cbd4e4a52adf8ca7b15c84a5c18331d6359eb51e
0b76b1251859f7acc4daf4afd746b81945b7c720
[]
[ "?", "Can we use `useFieldArray`?", "Timeout?", "does userEvent not work here?", "does userEvent not work here?", "does userEvent not work here?", "seems there is a problem related to fieldset tag, via userevent.paste we are getting error in tests but fireevent.paste is working without issues, if there is no mandatory we can let it be fireevent", "fixed using waitFor", "returned back validation but fields are not required for now, will add it when everything will work properly", "done" ]
"2022-06-21T07:40:00Z"
[ "type/enhancement", "scope/frontend", "status/accepted" ]
Implement a view to set KSQL stream properties
Use string-string key-value pairs instead; format them as JSON.
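The gold patch below replaces the free-form JSON editor with key/value rows and folds them into a plain object before submitting. A minimal sketch of that transformation (the example property name is just an illustration of a typical KSQL streams property):

```ts
type StreamsPropertiesType = { key: string; value: string };

// Fold the form's key/value rows into a string-to-string object,
// as the gold patch does before sending the KSQL command.
const toStreamsProperties = (rows: StreamsPropertiesType[]) =>
  rows.reduce<Record<string, string>>(
    (acc, { key, value }) => ({ ...acc, [key]: value }),
    {}
  );

// toStreamsProperties([{ key: 'auto.offset.reset', value: 'earliest' }])
// -> { 'auto.offset.reset': 'earliest' }
```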
[ "kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx" ]
[ "kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx", "kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx b/kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx index 4982807ac24..c5757e2cae6 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/Query.tsx @@ -198,15 +198,23 @@ const Query: FC = () => { const submitHandler = useCallback( (values: FormValues) => { + const streamsProperties = values.streamsProperties.reduce( + (acc, current) => ({ + ...acc, + [current.key as keyof string]: current.value, + }), + {} as { [key: string]: string } + ); setFetching(true); dispatch( executeKsql({ clusterName, ksqlCommandV2: { ...values, - streamsProperties: values.streamsProperties - ? JSON.parse(values.streamsProperties) - : undefined, + streamsProperties: + values.streamsProperties[0].key !== '' + ? JSON.parse(JSON.stringify(streamsProperties)) + : undefined, }, }) ); diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts index 980fa0c2165..202e5862907 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts @@ -27,8 +27,47 @@ export const KSQLButtons = styled.div` gap: 16px; `; +export const StreamPropertiesContainer = styled.label` + display: flex; + flex-direction: column; + gap: 10px; + width: 50%; +`; + +export const InputsContainer = styled.div` + display: flex; + justify-content: center; + gap: 10px; +`; + +export const StreamPropertiesInputWrapper = styled.div` + & > input { + height: 40px; + border: 1px solid grey; + border-radius: 4px; + min-width: 300px; + font-size: 16px; + padding-left: 15px; + } +`; + +export const DeleteButtonWrapper = styled.div` + min-height: 32px; + display: flex; + flex-direction: column; + align-items: center; + justify-self: flex-start; + margin-top: 10px; +`; + +export const LabelContainer = styled.div` + display: flex; + align-items: center; + justify-content: space-between; +`; + export const Fieldset = styled.fieldset` - width: 100%; + width: 50%; `; export const Editor = styled(BaseEditor)( diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx index f824dd256a7..a4d2a275bea 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.tsx @@ -1,11 +1,12 @@ import React from 'react'; import { FormError } from 'components/common/Input/Input.styled'; import { ErrorMessage } from '@hookform/error-message'; +import { useForm, Controller, useFieldArray } from 'react-hook-form'; +import { Button } from 'components/common/Button/Button'; +import IconButtonWrapper from 'components/common/Icons/IconButtonWrapper'; +import CloseIcon from 'components/common/Icons/CloseIcon'; import { yupResolver } from '@hookform/resolvers/yup'; import yup from 'lib/yupExtended'; -import { useForm, Controller } from 'react-hook-form'; -import { Button } from 'components/common/Button/Button'; -import { SchemaType } from 'generated-sources'; import * as S from './QueryForm.styled'; @@ -17,16 +18,22 @@ export interface Props { submitHandler: (values: FormValues) => void; } +export type StreamsPropertiesType = { + key: string; + value: string; +}; export type FormValues = { ksql: string; - streamsProperties: string; + 
streamsProperties: StreamsPropertiesType[]; }; +const streamsPropertiesSchema = yup.object().shape({ + key: yup.string().trim(), + value: yup.string().trim(), +}); const validationSchema = yup.object({ ksql: yup.string().trim().required(), - streamsProperties: yup.lazy((value) => - value === '' ? yup.string().trim() : yup.string().trim().isJsonObject() - ), + streamsProperties: yup.array().of(streamsPropertiesSchema), }); const QueryForm: React.FC<Props> = ({ @@ -46,9 +53,16 @@ const QueryForm: React.FC<Props> = ({ resolver: yupResolver(validationSchema), defaultValues: { ksql: '', - streamsProperties: '', + streamsProperties: [{ key: '', value: '' }], }, }); + const { fields, append, remove } = useFieldArray< + FormValues, + 'streamsProperties' + >({ + control, + name: 'streamsProperties', + }); return ( <S.QueryWrapper> @@ -93,48 +107,69 @@ const QueryForm: React.FC<Props> = ({ <ErrorMessage errors={errors} name="ksql" /> </FormError> </S.Fieldset> - <S.Fieldset aria-labelledby="streamsPropertiesLabel"> - <S.KSQLInputHeader> - <label id="streamsPropertiesLabel"> - Stream properties (JSON format) - </label> - <Button - onClick={() => setValue('streamsProperties', '')} - buttonType="primary" - buttonSize="S" - isInverted - > - Clear - </Button> - </S.KSQLInputHeader> - <Controller - control={control} - name="streamsProperties" - render={({ field }) => ( - <S.Editor - {...field} - commands={[ - { - // commands is array of key bindings. - // name for the key binding. - name: 'commandName', - // key combination used for the command. - bindKey: { win: 'Ctrl-Enter', mac: 'Command-Enter' }, - // function to execute when keys are pressed. - exec: () => { - handleSubmit(submitHandler)(); - }, - }, - ]} - schemaType={SchemaType.JSON} - readOnly={fetching} - /> - )} - /> - <FormError> - <ErrorMessage errors={errors} name="streamsProperties" /> - </FormError> - </S.Fieldset> + + <S.StreamPropertiesContainer> + Stream properties: + {fields.map((item, index) => ( + <S.InputsContainer key={item.id}> + <S.StreamPropertiesInputWrapper> + <Controller + control={control} + name={`streamsProperties.${index}.key`} + render={({ field }) => ( + <input + {...field} + placeholder="Key" + aria-label="key" + type="text" + /> + )} + /> + <FormError> + <ErrorMessage + errors={errors} + name={`streamsProperties.${index}.key`} + /> + </FormError> + </S.StreamPropertiesInputWrapper> + <S.StreamPropertiesInputWrapper> + <Controller + control={control} + name={`streamsProperties.${index}.value`} + render={({ field }) => ( + <input + {...field} + placeholder="Value" + aria-label="value" + type="text" + /> + )} + /> + <FormError> + <ErrorMessage + errors={errors} + name={`streamsProperties.${index}.value`} + /> + </FormError> + </S.StreamPropertiesInputWrapper> + + <S.DeleteButtonWrapper onClick={() => remove(index)}> + <IconButtonWrapper aria-label="deleteProperty"> + <CloseIcon aria-hidden /> + </IconButtonWrapper> + </S.DeleteButtonWrapper> + </S.InputsContainer> + ))} + <Button + type="button" + buttonSize="M" + buttonType="secondary" + onClick={() => append({ key: '', value: '' })} + > + <i className="fas fa-plus" /> + Add Stream Property + </Button> + </S.StreamPropertiesContainer> </S.KSQLInputsWrapper> <S.KSQLButtons> <Button diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx index 09f61cccb4a..12a61a5fa5f 100644 --- 
a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/__test__/QueryForm.spec.tsx @@ -1,7 +1,7 @@ import { render } from 'lib/testHelpers'; import React from 'react'; import QueryForm, { Props } from 'components/KsqlDb/Query/QueryForm/QueryForm'; -import { screen, within } from '@testing-library/dom'; +import { screen, waitFor, within } from '@testing-library/dom'; import userEvent from '@testing-library/user-event'; import { act } from '@testing-library/react'; @@ -26,20 +26,11 @@ describe('QueryForm', () => { // Represents SQL editor expect(within(KSQLBlock).getByRole('textbox')).toBeInTheDocument(); - const streamPropertiesBlock = screen.getByLabelText( - 'Stream properties (JSON format)' - ); + const streamPropertiesBlock = screen.getByRole('textbox', { name: 'key' }); expect(streamPropertiesBlock).toBeInTheDocument(); - expect( - within(streamPropertiesBlock).getByText('Stream properties (JSON format)') - ).toBeInTheDocument(); - expect( - within(streamPropertiesBlock).getByRole('button', { name: 'Clear' }) - ).toBeInTheDocument(); - // Represents JSON editor - expect( - within(streamPropertiesBlock).getByRole('textbox') - ).toBeInTheDocument(); + expect(screen.getByText('Stream properties:')).toBeInTheDocument(); + expect(screen.getByRole('button', { name: 'Clear' })).toBeInTheDocument(); + expect(screen.queryAllByRole('textbox')[0]).toBeInTheDocument(); // Form controls expect(screen.getByRole('button', { name: 'Execute' })).toBeInTheDocument(); @@ -69,58 +60,10 @@ describe('QueryForm', () => { await act(() => userEvent.click(screen.getByRole('button', { name: 'Execute' })) ); - expect(screen.getByText('ksql is a required field')).toBeInTheDocument(); - expect(submitFn).not.toBeCalled(); - }); - - it('renders error with non-JSON streamProperties', async () => { - renderComponent({ - fetching: false, - hasResults: false, - handleClearResults: jest.fn(), - handleSSECancel: jest.fn(), - submitHandler: jest.fn(), - }); - - await act(() => { - // the use of `paste` is a hack that i found somewhere, - // `type` won't work - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - 'not-a-JSON-string' - ); - - userEvent.click(screen.getByRole('button', { name: 'Execute' })); + waitFor(() => { + expect(screen.getByText('ksql is a required field')).toBeInTheDocument(); + expect(submitFn).not.toBeCalled(); }); - - expect( - screen.getByText('streamsProperties is not JSON object') - ).toBeInTheDocument(); - }); - - it('renders without error with correct JSON', async () => { - renderComponent({ - fetching: false, - hasResults: false, - handleClearResults: jest.fn(), - handleSSECancel: jest.fn(), - submitHandler: jest.fn(), - }); - - await act(() => { - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"totallyJSON": "string"}' - ); - userEvent.click(screen.getByRole('button', { name: 'Execute' })); - }); - expect( - screen.queryByText('streamsProperties is not JSON object') - ).not.toBeInTheDocument(); }); it('submits with correct inputs', async () => { @@ -134,18 +77,9 @@ describe('QueryForm', () => { }); await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); - - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"totallyJSON": "string"}' - ); - + 
userEvent.paste(screen.getAllByRole('textbox')[0], 'show tables;'); + userEvent.paste(screen.getByRole('textbox', { name: 'key' }), 'test'); + userEvent.paste(screen.getByRole('textbox', { name: 'value' }), 'test'); userEvent.click(screen.getByRole('button', { name: 'Execute' })); }); @@ -223,41 +157,7 @@ describe('QueryForm', () => { expect(submitFn.mock.calls.length).toBe(1); }); - it('submits form with ctrl+enter on streamProperties editor', async () => { - const submitFn = jest.fn(); - renderComponent({ - fetching: false, - hasResults: false, - handleClearResults: jest.fn(), - handleSSECancel: jest.fn(), - submitHandler: submitFn, - }); - - await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); - - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"some":"json"}' - ); - - userEvent.type( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{ctrl}{enter}' - ); - }); - - expect(submitFn.mock.calls.length).toBe(1); - }); - - it('clears KSQL with Clear button', async () => { + it('add new property', async () => { renderComponent({ fetching: false, hasResults: false, @@ -267,22 +167,15 @@ describe('QueryForm', () => { }); await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); userEvent.click( - within(screen.getByLabelText('KSQL')).getByRole('button', { - name: 'Clear', - }) + screen.getByRole('button', { name: 'Add Stream Property' }) ); }); - - expect(screen.queryByText('show tables;')).not.toBeInTheDocument(); + expect(screen.getAllByRole('textbox', { name: 'key' }).length).toEqual(2); }); - it('clears streamProperties with Clear button', async () => { - renderComponent({ + it('delete stream property', async () => { + await renderComponent({ fetching: false, hasResults: false, handleClearResults: jest.fn(), @@ -291,20 +184,13 @@ describe('QueryForm', () => { }); await act(() => { - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"some":"json"}' - ); userEvent.click( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('button', { - name: 'Clear', - }) + screen.getByRole('button', { name: 'Add Stream Property' }) ); }); - expect(screen.queryByText('{"some":"json"}')).not.toBeInTheDocument(); + await act(() => { + userEvent.click(screen.getAllByLabelText('deleteProperty')[0]); + }); + expect(screen.getAllByRole('textbox', { name: 'key' }).length).toEqual(1); }); }); diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx b/kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx index fd2cacfd7f1..985ebde5e5b 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/__test__/Query.spec.tsx @@ -3,11 +3,11 @@ import React from 'react'; import Query, { getFormattedErrorFromTableData, } from 'components/KsqlDb/Query/Query'; -import { screen, within } from '@testing-library/dom'; +import { screen } from '@testing-library/dom'; import fetchMock from 'fetch-mock'; -import userEvent from '@testing-library/user-event'; import { clusterKsqlDbQueryPath } from 'lib/paths'; import { act } from '@testing-library/react'; +import userEvent from '@testing-library/user-event'; const clusterName = 'testLocal'; const renderComponent = () => @@ -25,9 +25,7 
@@ describe('Query', () => { renderComponent(); expect(screen.getByLabelText('KSQL')).toBeInTheDocument(); - expect( - screen.getByLabelText('Stream properties (JSON format)') - ).toBeInTheDocument(); + expect(screen.getByLabelText('Stream properties:')).toBeInTheDocument(); }); afterEach(() => fetchMock.reset()); @@ -41,12 +39,10 @@ describe('Query', () => { Object.defineProperty(window, 'EventSource', { value: EventSourceMock, }); - + const inputs = screen.getAllByRole('textbox'); + const textAreaElement = inputs[0] as HTMLTextAreaElement; await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); + userEvent.paste(textAreaElement, 'show tables;'); userEvent.click(screen.getByRole('button', { name: 'Execute' })); }); @@ -63,47 +59,19 @@ describe('Query', () => { Object.defineProperty(window, 'EventSource', { value: EventSourceMock, }); - await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"some":"json"}' - ); - userEvent.click(screen.getByRole('button', { name: 'Execute' })); + const inputs = screen.getAllByRole('textbox'); + const textAreaElement = inputs[0] as HTMLTextAreaElement; + userEvent.paste(textAreaElement, 'show tables;'); }); - expect(mock.calls().length).toBe(1); - }); - - it('fetch on execute with streamParams', async () => { - renderComponent(); - - const mock = fetchMock.postOnce(`/api/clusters/${clusterName}/ksql/v2`, { - pipeId: 'testPipeID', - }); - - Object.defineProperty(window, 'EventSource', { - value: EventSourceMock, + await act(() => { + userEvent.paste(screen.getByLabelText('key'), 'key'); + userEvent.paste(screen.getByLabelText('value'), 'value'); }); - await act(() => { - userEvent.paste( - within(screen.getByLabelText('KSQL')).getByRole('textbox'), - 'show tables;' - ); - userEvent.paste( - within( - screen.getByLabelText('Stream properties (JSON format)') - ).getByRole('textbox'), - '{"some":"json"}' - ); userEvent.click(screen.getByRole('button', { name: 'Execute' })); }); + expect(mock.calls().length).toBe(1); }); });
null
train
train
2022-07-07T17:50:26
"2022-05-12T13:38:27Z"
Haarolean
train
provectus/kafka-ui/2154_2192
provectus/kafka-ui
provectus/kafka-ui/2154
provectus/kafka-ui/2192
[ "timestamp(timedelta=0.0, similarity=0.8734458374460558)", "connected" ]
3c922bc4706e44f0a6aaa33210b76530e841f56c
aa839b4d69106e50500d4a80ac4e65a26c9c8bc7
[]
[]
"2022-06-21T11:36:14Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Add asterisk for required Custom parameter field within Create Topic form
**Describe the bug** Please add an asterisk to the required Custom Parameter field **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Press Add a Topic 3. Add a Custom Parameter 4. Press Submit **Expected behavior** The required field should be marked with an asterisk so the user knows it must be filled <img width="1718" alt="required custom parameter" src="https://user-images.githubusercontent.com/104780608/173297195-b600c3b7-b669-496c-ab35-f272bb3b9cbb.png">
[ "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx index 72f28c5c461..2b85b5276c7 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParamField.tsx @@ -67,7 +67,7 @@ const CustomParamField: React.FC<Props> = ({ return ( <C.Column> <div> - <InputLabel>Custom Parameter</InputLabel> + <InputLabel>Custom Parameter *</InputLabel> <Controller control={control} rules={{ required: 'Custom Parameter is required.' }}
null
val
train
2022-06-21T17:53:11
"2022-06-13T06:58:55Z"
armenuikafka
train
provectus/kafka-ui/2182_2193
provectus/kafka-ui
provectus/kafka-ui/2182
provectus/kafka-ui/2193
[ "connected" ]
8acbcbacb9c04e6f4f2bcc95c1a9f9480804cccf
cc0a98262b1116e1bc4d4eed00fa43614eac1ddf
[]
[]
"2022-06-21T11:41:43Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Move 'x' icon to be in the top right of error message
**Describe the bug** Move the 'x' icon to the top right of the error message **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Schema Registry 2. Edit the schema 3. Leave any field empty 4. Submit the schema 5. Check the error message **Expected behavior** The 'x' icon should be in the top right of the error message **Screenshots** <img width="1718" alt="error message schema" src="https://user-images.githubusercontent.com/104780608/174236075-9510ae79-9483-4a6c-9598-795a4d40be0e.png">
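A hedged sketch of the layout idea behind this kind of fix: in a flex row, `justify-content: space-between` pushes the dismiss icon to the right edge, and moving the cross-axis alignment away from `center` pins it to the top. The component and values here are illustrative; the actual gold patch below switches `align-items` to `baseline`:

```tsx
import styled from 'styled-components';

// Alert container sketch: the close icon ends up in the top-right corner
// even when the message text wraps onto several lines.
const Alert = styled.div`
  display: flex;
  justify-content: space-between; /* message left, close icon right */
  align-items: flex-start; /* pin children to the top of the row */
  padding: 12px;
`;

export default Alert;
```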
[ "kafka-ui-react-app/src/components/Alerts/Alert.styled.ts" ]
[ "kafka-ui-react-app/src/components/Alerts/Alert.styled.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Alerts/Alert.styled.ts b/kafka-ui-react-app/src/components/Alerts/Alert.styled.ts index 131fc922437..298f8749032 100644 --- a/kafka-ui-react-app/src/components/Alerts/Alert.styled.ts +++ b/kafka-ui-react-app/src/components/Alerts/Alert.styled.ts @@ -9,7 +9,7 @@ export const Alert = styled.div<{ $type: AlertType }>` padding: 12px; display: flex; justify-content: space-between; - align-items: center; + align-items: baseline; filter: drop-shadow(0px 4px 16px ${({ theme }) => theme.alert.shadow}); margin-top: 10px; line-height: 20px;
null
train
train
2022-06-27T12:07:47
"2022-06-17T06:17:45Z"
armenuikafka
train
provectus/kafka-ui/2181_2194
provectus/kafka-ui
provectus/kafka-ui/2181
provectus/kafka-ui/2194
[ "timestamp(timedelta=0.0, similarity=1.0)", "connected" ]
cc0a98262b1116e1bc4d4eed00fa43614eac1ddf
d7a3629470606a644c3bcc82f093aa609c6ac29c
[]
[]
"2022-06-21T12:01:57Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Change the name of path for Schema Compare versions
**Describe the bug** Change the name of the path for the schema Compare Versions view **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Schema Registry 2. Open the schema 3. Press Compare Versions 4. Check the path in the URL **Expected behavior** It would be better for the path to show 'compare' instead of 'diff' when comparing schema versions **Screenshots** <img width="1718" alt="versions schema" src="https://user-images.githubusercontent.com/104780608/174235375-2003fb89-f2b0-421c-bbd1-6397db8c4b94.png">
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx", "kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx", "kafka-ui-react-app/src/lib/paths.ts" ]
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx", "kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx", "kafka-ui-react-app/src/lib/paths.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx index 386fc505903..299b0984b13 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx @@ -3,7 +3,7 @@ import { useNavigate } from 'react-router-dom'; import { ClusterSubjectParam, clusterSchemaEditPageRelativePath, - clusterSchemaSchemaDiffPageRelativePath, + clusterSchemaSchemaComparePageRelativePath, } from 'lib/paths'; import ClusterContext from 'components/contexts/ClusterContext'; import ConfirmationModal from 'components/common/ConfirmationModal/ConfirmationModal'; @@ -90,7 +90,7 @@ const Details: React.FC = () => { buttonSize="M" buttonType="primary" to={{ - pathname: clusterSchemaSchemaDiffPageRelativePath, + pathname: clusterSchemaSchemaComparePageRelativePath, search: `leftVersion=${versions[0]?.version}&rightVersion=${versions[0]?.version}`, }} > diff --git a/kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx b/kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx index 4c12bc66fa4..94636fa4b8f 100644 --- a/kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Diff/Diff.tsx @@ -1,6 +1,6 @@ import React from 'react'; import { SchemaSubject } from 'generated-sources'; -import { clusterSchemaSchemaDiffPath, ClusterSubjectParam } from 'lib/paths'; +import { clusterSchemaSchemaComparePath, ClusterSubjectParam } from 'lib/paths'; import PageLoader from 'components/common/PageLoader/PageLoader'; import DiffViewer from 'components/common/DiffViewer/DiffViewer'; import { useNavigate, useLocation } from 'react-router-dom'; @@ -86,7 +86,7 @@ const Diff: React.FC<DiffProps> = ({ versions, areVersionsFetched }) => { } onChange={(event) => { navigate( - clusterSchemaSchemaDiffPath(clusterName, subject) + clusterSchemaSchemaComparePath(clusterName, subject) ); searchParams.set('leftVersion', event.toString()); searchParams.set( @@ -127,7 +127,7 @@ const Diff: React.FC<DiffProps> = ({ versions, areVersionsFetched }) => { } onChange={(event) => { navigate( - clusterSchemaSchemaDiffPath(clusterName, subject) + clusterSchemaSchemaComparePath(clusterName, subject) ); searchParams.set( 'leftVersion', diff --git a/kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx b/kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx index 418393bb503..029e589b9ad 100644 --- a/kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Diff/__test__/Diff.spec.tsx @@ -2,13 +2,13 @@ import React from 'react'; import Diff, { DiffProps } from 'components/Schemas/Diff/Diff'; import { render, WithRoute } from 'lib/testHelpers'; import { screen } from '@testing-library/react'; -import { clusterSchemaSchemaDiffPath } from 'lib/paths'; +import { clusterSchemaSchemaComparePath } from 'lib/paths'; import { versions } from './fixtures'; const defaultClusterName = 'defaultClusterName'; const defaultSubject = 'defaultSubject'; -const defaultPathName = clusterSchemaSchemaDiffPath( +const defaultPathName = clusterSchemaSchemaComparePath( defaultClusterName, defaultSubject ); @@ -30,7 +30,7 @@ describe('Diff', () => { pathname = `${pathname}?${searchParams.toString()}`; return render( - <WithRoute path={clusterSchemaSchemaDiffPath()}> + <WithRoute path={clusterSchemaSchemaComparePath()}> <Diff versions={props.versions} 
areVersionsFetched={props.areVersionsFetched} diff --git a/kafka-ui-react-app/src/lib/paths.ts b/kafka-ui-react-app/src/lib/paths.ts index 2e15a856a86..4f80d07c14c 100644 --- a/kafka-ui-react-app/src/lib/paths.ts +++ b/kafka-ui-react-app/src/lib/paths.ts @@ -72,9 +72,9 @@ export type ClusterGroupParam = { export const clusterSchemasRelativePath = 'schemas'; export const clusterSchemaNewRelativePath = 'create-new'; export const clusterSchemaEditPageRelativePath = `edit`; -export const clusterSchemaSchemaDiffPageRelativePath = `diff`; +export const clusterSchemaSchemaComparePageRelativePath = `compare`; export const clusterSchemaEditRelativePath = `${RouteParams.subject}/${clusterSchemaEditPageRelativePath}`; -export const clusterSchemaSchemaDiffRelativePath = `${RouteParams.subject}/${clusterSchemaSchemaDiffPageRelativePath}`; +export const clusterSchemaSchemaDiffRelativePath = `${RouteParams.subject}/${clusterSchemaSchemaComparePageRelativePath}`; export const clusterSchemasPath = ( clusterName: ClusterName = RouteParams.clusterName ) => `${clusterPath(clusterName)}/schemas`; @@ -89,10 +89,10 @@ export const clusterSchemaEditPath = ( clusterName: ClusterName = RouteParams.clusterName, subject: SchemaName = RouteParams.subject ) => `${clusterSchemasPath(clusterName)}/${subject}/edit`; -export const clusterSchemaSchemaDiffPath = ( +export const clusterSchemaSchemaComparePath = ( clusterName: ClusterName = RouteParams.clusterName, subject: SchemaName = RouteParams.subject -) => `${clusterSchemaPath(clusterName, subject)}/diff`; +) => `${clusterSchemaPath(clusterName, subject)}/compare`; export type ClusterSubjectParam = { subject: string;
null
train
train
2022-06-27T14:24:47
"2022-06-17T06:04:35Z"
armenuikafka
train
provectus/kafka-ui/2167_2201
provectus/kafka-ui
provectus/kafka-ui/2167
provectus/kafka-ui/2201
[ "timestamp(timedelta=0.0, similarity=0.8783625712732184)", "connected" ]
9c7f078dedf40e1d87a21a4434aeff9466f12f53
5cb1a7e0ce53b673a0e5b5133b18818f13088568
[ "Hello there tcortega! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hello @tcortega ,\r\nchange that you are doing is not backward compatible, since you don't provide new field default. \r\nyou can either change subject compatibility (to forwards for example) or make change backward compatible. \r\n\r\nAs for error message I agree that it is not informative, we will check how it can be fixed.\r\n", "> Hello @tcortega , change that you are doing is not backward compatible, since you don't provide new field default. you can either change subject compatibility (to forwards for example) or make change backward compatible.\r\n> \r\n> As for error message I agree that it is not informative, we will check how it can be fixed.\r\n\r\nThe same error happens when the compatibility level is set to forward\r\n\r\n![image](https://user-images.githubusercontent.com/60912483/173867069-b1d041e4-1b29-4dfa-a3ba-b6a79ed44295.png)\r\n", "@tcortega hey, there's currently a bug in case you're trying to change both compatibility level and the schema itself at the same time. As a workaround please do change the level first, save it, and then edit the schema. Meanwhile we'll fix the bug.", "@provectus/kafka-frontend \r\n\r\nWhen changing a schema, if both compatibility level and the schema itself are changed, there are two requests sent to backend:\r\n1. update schema\r\n2. update compatibility level\r\n\r\nand the order of these is incorrect. It should be vice versa." ]
[]
"2022-06-22T13:49:40Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Incorrect order of requests for changing SR compatibility level and schema itself
TODO: https://github.com/provectus/kafka-ui/issues/2167#issuecomment-1158835488 **Describe the bug** When attempting to edit a schema within the schema registry screen, I get an error that does not give me any information. **Set up** I can provide it later if needed, but we basically set up a simple Kafka cluster and the schema registry itself, using Confluent's docker image. **Steps to Reproduce** Steps to reproduce the behavior: 1. Open the Schema Registry screen 2. Create a new AVRO schema; the one we created is the following: `{ "type" : "record", "namespace" : "DataFlair", "name" : "Student", "fields" : [ { "name" : "Name" , "type" : "string" }, { "name" : "Age" , "type" : "int" } ] }` 3. Try to edit it, choosing any compatibility mode. **Actual Behavior** An error message appears that gives no useful information. **Screenshots** ![image](https://user-images.githubusercontent.com/60912483/173689737-7603bc17-733e-444c-968e-b106e7bfa9b4.png)
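A minimal sketch of the submit order the maintainers describe in the linked comment: when both the compatibility level and the schema change, the level must be updated first, otherwise the edited schema is still validated against the old level and rejected. The client interface here is hypothetical, written only to show the ordering; the real app calls a generated `schemasApiClient` with equivalent methods (see the gold patch below):

```typescript
// Hypothetical client interface for illustration only.
interface SchemaRegistryClient {
  updateCompatibilityLevel(subject: string, level: string): Promise<void>;
  registerNewSchemaVersion(subject: string, schema: string): Promise<void>;
}

// Submit edits in the order that avoids the bug: compatibility level
// first, then the edited schema.
async function submitSchemaEdit(
  client: SchemaRegistryClient,
  subject: string,
  changes: { newLevel?: string; newSchema?: string }
): Promise<void> {
  if (changes.newLevel !== undefined) {
    await client.updateCompatibilityLevel(subject, changes.newLevel);
  }
  if (changes.newSchema !== undefined) {
    await client.registerNewSchemaVersion(subject, changes.newSchema);
  }
}
```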
[ "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx index 550496d7451..1b960675ec6 100644 --- a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx @@ -62,18 +62,6 @@ const Edit: React.FC = () => { if (!schema) return; try { - if (dirtyFields.newSchema || dirtyFields.schemaType) { - const resp = await schemasApiClient.createNewSchema({ - clusterName, - newSchemaSubject: { - ...schema, - schema: props.newSchema || schema.schema, - schemaType: props.schemaType || schema.schemaType, - }, - }); - dispatch(schemaAdded(resp)); - } - if (dirtyFields.compatibilityLevel) { await schemasApiClient.updateSchemaCompatibilityLevel({ clusterName, @@ -89,6 +77,17 @@ const Edit: React.FC = () => { }) ); } + if (dirtyFields.newSchema || dirtyFields.schemaType) { + const resp = await schemasApiClient.createNewSchema({ + clusterName, + newSchemaSubject: { + ...schema, + schema: props.newSchema || schema.schema, + schemaType: props.schemaType || schema.schemaType, + }, + }); + dispatch(schemaAdded(resp)); + } navigate(clusterSchemaPath(clusterName, subject)); } catch (e) {
null
test
train
2022-06-23T15:59:30
"2022-06-14T21:14:48Z"
tcortega
train
provectus/kafka-ui/1500_2205
provectus/kafka-ui
provectus/kafka-ui/1500
provectus/kafka-ui/2205
[ "timestamp(timedelta=1.0, similarity=0.9896843102954588)", "connected" ]
13d168c8a55df8e5cb5f3b5359f6893a1461bc3c
819ae60e6b3b9d598552c3db1e40f8c9cbb8c125
[ "Hello there sbritprovectus! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Almost all fields are not set according to the actual values but rather contain the widgets default value. I'm happy to take a look at this." ]
[ "I'm not sure why this is here. I can see you reassign it `formInit = false;` on line 79 inside useEffect. This is not a good way to do, instead use useState, or if you don't need to trigger rerenders use useRef to store this variable.", "Just create a ./fixtures file somewhere near this test file to separate such huge objects from the tests.", "```suggestion\r\nconst startParams: TopicWithDetailedInfo = {\r\n```", "```suggestion\r\n source: ConfigSource.DEFAULT_CONFIG,\r\n```", "check this file. We already have fixtures\r\n```\r\nsrc/components/Topics/Topic/Edit/__test__/fixtures.ts\r\n```", "pls cover all the cases it is not complex", "it returns default values when topic is not defined", "describe maxMessageBytes\r\n - it \r\n - it convert to number\r\n - it returns default value when ", "test case", "```suggestion\r\n it('returns completed values', () => {\r\n```", "pls fix it for all cases", "```suggestion\r\n it('returns transformed values', () => {\r\n```", "```suggestion\r\n transformedParams.partitions\r\n```", "move payload to fixtures file", "I would prefer to move `|| -1` logic inside of `getValue`. So add one more `defaultValue` param", "```suggestion\r\n it('returns transformed value', () => {\r\n```", "```suggestion\r\n it(`returns default value when ${name} not defined`, () => {\r\n```", "```suggestion\r\n it('returns number value', () => {\r\n```", "```suggestion\r\n it('returns value when field exists', () => {\r\n```", "```suggestion\r\n it('returns default value when field does not exist', () => {\r\n```", "```suggestion\r\n it('returns value when filed name does not exist', () => {\r\n```", "```suggestion\r\n it('returns transformed value', () => {\r\n```", "```suggestion\r\n it('returns default value when partitionCount not defined', () => {\r\n```", "```suggestion\r\n describe('customParams', () => {\r\n it('returns value when configs is empty', () => {\r\n```", "Are you sure we can use `-1` by default? ping @Haarolean ", "Uuhm is that like for all the int params? Seems dangerous. How did it work before?", "I added that. There was no such thing.", "```suggestion\r\n defaultValue?: string | number\r\n```", "returns" ]
"2022-06-24T12:05:44Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
[Topics] There are default values when the user tries to edit a topic
**Describe the bug** When the user tries to edit a topic after creation, the Edit Settings view shows default values instead of the topic's current settings **Set up** Local docker. App version: v0.3.3(38c4cf7) **Steps to Reproduce** 1. Create a topic (e.g. using the API) 2. Go to the newly created topic 3. Go to Edit Settings 4. Edit something (e.g. change the cleanup policy) 5. Save 6. Go to Edit Settings again 7. Observe the view **Expected behavior** The Edit Settings view should show the topic's current values **Actual result** All settings show their default values
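For context, a sketch of the kind of config transformer the fix introduces — pre-filling the edit form from the topic's fetched config instead of hard-coded defaults. The types are trimmed down for illustration; the real shapes live in the app's redux interfaces, and the gold patch below adds an equivalent `getValue` helper:

```typescript
// Trimmed-down shapes, assumed for illustration.
interface TopicConfigEntry {
  name: string;
  value?: string;
}
interface TopicInfo {
  config?: TopicConfigEntry[];
}

// Read a numeric config entry from the fetched topic, falling back to a
// default only when the entry is missing or not a number.
const getConfigValue = (
  topic: TopicInfo,
  fieldName: string,
  defaultValue?: number
): number | undefined => {
  const raw = topic.config?.find((c) => c.name === fieldName)?.value;
  const parsed = Number(raw);
  return Number.isNaN(parsed) ? defaultValue : parsed;
};

// e.g. getConfigValue(topic, 'retention.ms', 604800000) pre-fills the
// retention field with the topic's current value when one is set.
```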
[ "kafka-ui-react-app/src/components/Topics/New/New.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts", "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/components/Topics/New/New.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts", "kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/topicParamsTransformer.spec.ts", "kafka-ui-react-app/src/components/Topics/Topic/Edit/topicParamsTransformer.ts", "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/New/New.tsx b/kafka-ui-react-app/src/components/Topics/New/New.tsx index 7346820afce..7faefb36c9d 100644 --- a/kafka-ui-react-app/src/components/Topics/New/New.tsx +++ b/kafka-ui-react-app/src/components/Topics/New/New.tsx @@ -57,7 +57,7 @@ const New: React.FC = () => { partitionCount={Number(partitionCount)} replicationFactor={Number(replicationFactor)} inSyncReplicas={Number(inSyncReplicas)} - isSubmitting={methods.formState.isSubmitting} + isSubmitting={false} onSubmit={methods.handleSubmit(onSubmit)} /> </FormProvider> diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx index 807c6e9c7b1..76309dc182d 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx @@ -1,10 +1,9 @@ -import React from 'react'; +import React, { useEffect } from 'react'; import { ClusterName, TopicFormDataRaw, TopicName, TopicConfigByName, - TopicWithDetailedInfo, TopicFormData, } from 'redux/interfaces'; import { useForm, FormProvider } from 'react-hook-form'; @@ -13,12 +12,13 @@ import { RouteParamsClusterTopic } from 'lib/paths'; import { useNavigate } from 'react-router-dom'; import { yupResolver } from '@hookform/resolvers/yup'; import { topicFormValidationSchema } from 'lib/yupExtended'; -import { TOPIC_CUSTOM_PARAMS_PREFIX, TOPIC_CUSTOM_PARAMS } from 'lib/constants'; import styled from 'styled-components'; import PageHeading from 'components/common/PageHeading/PageHeading'; import { useAppSelector } from 'lib/hooks/redux'; import { getFullTopic } from 'redux/reducers/topics/selectors'; import useAppParams from 'lib/hooks/useAppParams'; +import topicParamsTransformer from 'components/Topics/Topic/Edit/topicParamsTransformer'; +import { MILLISECONDS_IN_WEEK } from 'lib/constants'; import DangerZoneContainer from './DangerZone/DangerZoneContainer'; @@ -39,6 +39,7 @@ export interface Props { const EditWrapperStyled = styled.div` display: flex; justify-content: center; + & > * { width: 800px; } @@ -50,29 +51,9 @@ export const DEFAULTS = { minInSyncReplicas: 1, cleanupPolicy: 'delete', retentionBytes: -1, + retentionMs: MILLISECONDS_IN_WEEK, maxMessageBytes: 1000012, -}; - -const topicParams = (topic: TopicWithDetailedInfo | undefined) => { - if (!topic) { - return DEFAULTS; - } - - const { name, replicationFactor } = topic; - - return { - ...DEFAULTS, - name, - partitions: topic.partitionCount || DEFAULTS.partitions, - replicationFactor, - [TOPIC_CUSTOM_PARAMS_PREFIX]: topic.config - ?.filter( - (el) => - el.value !== el.defaultValue && - Object.keys(TOPIC_CUSTOM_PARAMS).includes(el.name) - ) - .map((el) => ({ name: el.name, value: el.value })), - }; + customParams: [], }; let formInit = false; @@ -87,7 +68,7 @@ const Edit: React.FC<Props> = ({ const topic = useAppSelector((state) => getFullTopic(state, topicName)); - const defaultValues = React.useMemo(() => topicParams(topic), [topic]); + const defaultValues = topicParamsTransformer(topic); const methods = useForm<TopicFormData>({ defaultValues, @@ -95,12 +76,16 @@ const Edit: React.FC<Props> = ({ mode: 'onChange', }); + useEffect(() => { + methods.reset(defaultValues); + }, [!topic]); + const [isSubmitting, setIsSubmitting] = React.useState<boolean>(false); const navigate = useNavigate(); React.useEffect(() => { fetchTopicConfig({ clusterName, topicName }); - }, [fetchTopicConfig, clusterName, topicName]); + }, [fetchTopicConfig, 
clusterName, topicName, isTopicUpdated]); React.useEffect(() => { if (isSubmitting && isTopicUpdated) { @@ -138,7 +123,10 @@ const Edit: React.FC<Props> = ({ <FormProvider {...methods}> <TopicForm topicName={topicName} + retentionBytes={defaultValues.retentionBytes} + inSyncReplicas={Number(defaultValues.minInSyncReplicas)} isSubmitting={isSubmitting} + cleanUpPolicy={topic.cleanUpPolicy} isEditing onSubmit={methods.handleSubmit(onSubmit)} /> diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts index 49b7f033751..5cbdebd97aa 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/fixtures.ts @@ -551,3 +551,67 @@ export const topicWithInfo: TopicWithDetailedInfo = { partitions, config, }; +export const customConfigs = [ + { + name: 'segment.bytes', + value: '1', + defaultValue: '1073741824', + source: ConfigSource.DEFAULT_CONFIG, + isSensitive: false, + isReadOnly: false, + synonyms: [ + { + name: 'log.segment.bytes', + value: '1073741824', + source: ConfigSource.DEFAULT_CONFIG, + }, + ], + }, + { + name: 'retention.ms', + value: '604', + defaultValue: '604800000', + source: ConfigSource.DYNAMIC_TOPIC_CONFIG, + isSensitive: false, + isReadOnly: false, + synonyms: [ + { + name: 'retention.ms', + value: '604800000', + source: ConfigSource.DYNAMIC_TOPIC_CONFIG, + }, + ], + }, + { + name: 'flush.messages', + value: '92233', + defaultValue: '9223372036854775807', + source: ConfigSource.DEFAULT_CONFIG, + isSensitive: false, + isReadOnly: false, + synonyms: [ + { + name: 'log.flush.interval.messages', + value: '9223372036854775807', + source: ConfigSource.DEFAULT_CONFIG, + }, + ], + }, +]; + +export const transformedParams = { + partitions: 1, + replicationFactor: 1, + cleanupPolicy: 'delete', + retentionBytes: -1, + maxMessageBytes: 1000012, + name: topicName, + minInSyncReplicas: 1, + retentionMs: 604800000, + customParams: [ + { + name: 'delete.retention.ms', + value: '86400001', + }, + ], +}; diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/topicParamsTransformer.spec.ts b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/topicParamsTransformer.spec.ts new file mode 100644 index 00000000000..5077dc3b29e --- /dev/null +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/topicParamsTransformer.spec.ts @@ -0,0 +1,101 @@ +import topicParamsTransformer, { + getValue, +} from 'components/Topics/Topic/Edit/topicParamsTransformer'; +import { DEFAULTS } from 'components/Topics/Topic/Edit/Edit'; + +import { transformedParams, customConfigs, topicWithInfo } from './fixtures'; + +describe('topicParamsTransformer', () => { + const testField = (name: keyof typeof DEFAULTS, fieldName: string) => { + it('returns transformed value', () => { + expect(topicParamsTransformer(topicWithInfo)[name]).toEqual( + transformedParams[name] + ); + }); + it(`returns default value when ${name} not defined`, () => { + expect( + topicParamsTransformer({ + ...topicWithInfo, + config: topicWithInfo.config?.filter( + (config) => config.name !== fieldName + ), + })[name] + ).toEqual(DEFAULTS[name]); + }); + + it('returns number value', () => { + expect( + typeof topicParamsTransformer(topicWithInfo).retentionBytes + ).toEqual('number'); + }); + }; + + describe('getValue', () => { + it('returns value when field exists', () => { + expect( + getValue(topicWithInfo, 
'confluent.tier.segment.hotset.roll.min.bytes') + ).toEqual(104857600); + }); + it('returns undefined when filed name does not exist', () => { + expect(getValue(topicWithInfo, 'some.unsupported.fieldName')).toEqual( + undefined + ); + }); + it('returns default value when field does not exist', () => { + expect( + getValue(topicWithInfo, 'some.unsupported.fieldName', 100) + ).toEqual(100); + }); + }); + describe('Topic', () => { + it('returns default values when topic not defined found', () => { + expect(topicParamsTransformer(undefined)).toEqual(DEFAULTS); + }); + + it('returns transformed values', () => { + expect(topicParamsTransformer(topicWithInfo)).toEqual(transformedParams); + }); + }); + + describe('Topic partitions', () => { + it('returns transformed value', () => { + expect(topicParamsTransformer(topicWithInfo).partitions).toEqual( + transformedParams.partitions + ); + }); + it('returns default value when partitionCount not defined', () => { + expect( + topicParamsTransformer({ ...topicWithInfo, partitionCount: undefined }) + .partitions + ).toEqual(DEFAULTS.partitions); + }); + }); + + describe('maxMessageBytes', () => + testField('maxMessageBytes', 'max.message.bytes')); + + describe('minInSyncReplicas', () => + testField('minInSyncReplicas', 'min.insync.replicas')); + + describe('retentionBytes', () => + testField('retentionBytes', 'retention.bytes')); + + describe('retentionMs', () => testField('retentionMs', 'retention.ms')); + + describe('customParams', () => { + it('returns value when configs is empty', () => { + expect( + topicParamsTransformer({ ...topicWithInfo, config: [] }).customParams + ).toEqual([]); + }); + + it('returns value when had a 2 custom configs', () => { + expect( + topicParamsTransformer({ + ...topicWithInfo, + config: customConfigs, + }).customParams?.length + ).toEqual(2); + }); + }); +}); diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/topicParamsTransformer.ts b/kafka-ui-react-app/src/components/Topics/Topic/Edit/topicParamsTransformer.ts new file mode 100644 index 00000000000..1c5b8491c69 --- /dev/null +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/topicParamsTransformer.ts @@ -0,0 +1,43 @@ +import { TopicWithDetailedInfo } from 'redux/interfaces'; +import { + MILLISECONDS_IN_WEEK, + TOPIC_CUSTOM_PARAMS, + TOPIC_CUSTOM_PARAMS_PREFIX, +} from 'lib/constants'; +import { DEFAULTS } from 'components/Topics/Topic/Edit/Edit'; + +export const getValue = ( + topic: TopicWithDetailedInfo, + fieldName: string, + defaultValue?: number +) => + Number(topic?.config?.find((config) => config.name === fieldName)?.value) || + defaultValue; + +const topicParamsTransformer = (topic?: TopicWithDetailedInfo) => { + if (!topic) { + return DEFAULTS; + } + + const { name, replicationFactor } = topic; + + return { + ...DEFAULTS, + name, + replicationFactor, + partitions: topic.partitionCount || DEFAULTS.partitions, + maxMessageBytes: getValue(topic, 'max.message.bytes', 1000012), + minInSyncReplicas: getValue(topic, 'min.insync.replicas', 1), + retentionBytes: getValue(topic, 'retention.bytes', -1), + retentionMs: getValue(topic, 'retention.ms', MILLISECONDS_IN_WEEK), + + [TOPIC_CUSTOM_PARAMS_PREFIX]: topic.config + ?.filter( + (el) => + el.value !== el.defaultValue && + Object.keys(TOPIC_CUSTOM_PARAMS).includes(el.name) + ) + .map((el) => ({ name: el.name, value: el.value })), + }; +}; +export default topicParamsTransformer; diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx 
b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx index d2cb51fae5d..13afbe12335 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx @@ -19,6 +19,7 @@ export interface Props { partitionCount?: number; replicationFactor?: number; inSyncReplicas?: number; + retentionBytes?: number; cleanUpPolicy?: string; isEditing?: boolean; isSubmitting: boolean; @@ -40,6 +41,7 @@ const RetentionBytesOptions: Array<SelectOption> = [ ]; const TopicForm: React.FC<Props> = ({ + retentionBytes, topicName, isEditing, isSubmitting, @@ -55,8 +57,17 @@ const TopicForm: React.FC<Props> = ({ } = useFormContext(); const getCleanUpPolicy = CleanupPolicyOptions.find((option: SelectOption) => { - return option.value === cleanUpPolicy?.toLowerCase(); + return ( + option.value.toString().replace(/,/g, '_') === + cleanUpPolicy?.toLowerCase() + ); })?.value || CleanupPolicyOptions[0].value; + + const getRetentionBytes = + RetentionBytesOptions.find((option: SelectOption) => { + return option.value === retentionBytes; + })?.value || RetentionBytesOptions[0].value; + return ( <StyledForm onSubmit={onSubmit} aria-label="topic form"> <fieldset disabled={isSubmitting}> @@ -180,7 +191,7 @@ const TopicForm: React.FC<Props> = ({ id="topicFormRetentionBytes" aria-labelledby="topicFormRetentionBytesLabel" name={name} - value={RetentionBytesOptions[0].value} + value={getRetentionBytes} onChange={onChange} minWidth="100%" options={RetentionBytesOptions} diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index 271cc7798a9..4a661240197 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -169,12 +169,12 @@ const formatTopicUpdate = (form: TopicFormDataRaw): TopicUpdate => { return { configs: { + ...Object.values(customParams || {}).reduce(topicReducer, {}), 'cleanup.policy': cleanupPolicy, 'retention.ms': retentionMs, 'retention.bytes': retentionBytes, 'max.message.bytes': maxMessageBytes, 'min.insync.replicas': minInSyncReplicas, - ...Object.values(customParams || {}).reduce(topicReducer, {}), }, }; }; @@ -355,7 +355,7 @@ export const clearTopicsMessages = createAsyncThunk< } }); -const initialState: TopicsState = { +export const initialState: TopicsState = { byName: {}, allNames: [], totalPages: 1,
null
train
train
2022-08-01T18:18:24
"2022-01-27T07:44:00Z"
sbritprovectus
train
provectus/kafka-ui/2039_2221
provectus/kafka-ui
provectus/kafka-ui/2039
provectus/kafka-ui/2221
[ "connected" ]
002e4db355fdc189ca0f2fe0b90052b07966c809
7845476af1245b5113861ff3669fc10cc4678a6d
[]
[]
"2022-06-29T08:41:31Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Make Submit button inactive when required fields aren't filled
**Describe the bug** The Submit button is inactive by default in create forms; it should also stay inactive when the user clicks into a required input field and then leaves it empty **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to the schema registry 2. Create a schema 3. Make sure the Submit button is inactive 4. Click into a required input field 5. Observe that the Submit button becomes active 6. Click out of the field, leaving it empty **Expected behavior** The Submit button should stay inactive while required data is missing from the forms: _**Create new Topic, Produce Message, Create new schema, Create Connector**_ **Screenshots** ![inactive submit](https://user-images.githubusercontent.com/104780608/170481725-0937c7ee-9711-4a14-82ae-00945fe626a0.png)
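A minimal sketch of the react-hook-form approach the fix takes: validating with `mode: 'onChange'` keeps `formState.isValid` up to date on every keystroke, so the button goes back to disabled the moment a required field is cleared. The form fields here are placeholders, not the app's real forms:

```tsx
import React from 'react';
import { useForm } from 'react-hook-form';

const ExampleForm: React.FC = () => {
  const {
    register,
    handleSubmit,
    // `isValid` is recomputed on each change because of mode: 'onChange'.
    formState: { isValid, isDirty, isSubmitting },
  } = useForm<{ name: string }>({ mode: 'onChange' });

  return (
    <form onSubmit={handleSubmit(() => {})}>
      <input {...register('name', { required: true })} placeholder="Name" />
      {/* Disabled until the form is both touched and valid. */}
      <button type="submit" disabled={!isValid || !isDirty || isSubmitting}>
        Submit
      </button>
    </form>
  );
};

export default ExampleForm;
```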
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx", "kafka-ui-react-app/src/lib/yupExtended.ts", "kafka-ui-react-app/src/redux/interfaces/topic.ts", "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx", "kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx", "kafka-ui-react-app/src/lib/yupExtended.ts", "kafka-ui-react-app/src/redux/interfaces/topic.ts", "kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts", "kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/New/New.tsx b/kafka-ui-react-app/src/components/Schemas/New/New.tsx index 0d9d171b1f2..1aa843d6b59 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/New.tsx +++ b/kafka-ui-react-app/src/components/Schemas/New/New.tsx @@ -32,12 +32,12 @@ const New: React.FC = () => { const { clusterName } = useAppParams<ClusterNameRoute>(); const navigate = useNavigate(); const dispatch = useAppDispatch(); - const methods = useForm<NewSchemaSubjectRaw>(); + const methods = useForm<NewSchemaSubjectRaw>({ mode: 'onChange' }); const { register, handleSubmit, control, - formState: { isDirty, isSubmitting, errors }, + formState: { isDirty, isSubmitting, errors, isValid }, } = methods; const onSubmit = async ({ @@ -99,15 +99,15 @@ const New: React.FC = () => { <div> <InputLabel>Schema Type *</InputLabel> <Controller - defaultValue={SchemaTypeOptions[0].value as SchemaType} control={control} rules={{ required: 'Schema Type is required.' }} name="schemaType" - render={({ field: { name, onChange } }) => ( + render={({ field: { name, onChange, value } }) => ( <Select selectSize="M" name={name} - value={SchemaTypeOptions[0].value} + defaultValue={SchemaTypeOptions[0].value} + value={value} onChange={onChange} minWidth="50%" disabled={isSubmitting} @@ -124,7 +124,7 @@ const New: React.FC = () => { buttonSize="M" buttonType="primary" type="submit" - disabled={isSubmitting || !isDirty} + disabled={!isValid || isSubmitting || !isDirty} > Submit </Button> diff --git a/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx b/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx index aec534ce3e3..e03a2cfa0b1 100644 --- a/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx @@ -80,7 +80,12 @@ describe('New', () => { it('validates form', async () => { await act(() => renderComponent(clusterTopicNewPath(clusterName))); - userEvent.click(screen.getByText('Create topic')); + await waitFor(() => { + userEvent.type(screen.getByPlaceholderText('Topic Name'), topicName); + }); + await waitFor(() => { + userEvent.clear(screen.getByPlaceholderText('Topic Name')); + }); await waitFor(() => { expect(screen.getByText('name is a required field')).toBeInTheDocument(); }); @@ -96,8 +101,13 @@ describe('New', () => { await act(() => renderComponent(clusterTopicNewPath(clusterName))); - userEvent.type(screen.getByPlaceholderText('Topic Name'), topicName); - userEvent.click(screen.getByText('Create topic')); + await act(() => { + userEvent.type(screen.getByPlaceholderText('Topic Name'), topicName); + }); + + await act(() => { + userEvent.click(screen.getByText('Create topic')); + }); await waitFor(() => expect(mockNavigate).toBeCalledTimes(1)); expect(mockNavigate).toHaveBeenLastCalledWith(`../${topicName}`); @@ -127,6 +137,8 @@ describe('New', () => { await act(() => renderComponent(clusterTopicNewPath(clusterName))); await act(() => { userEvent.type(screen.getByPlaceholderText('Topic Name'), topicName); + }); + await act(() => { userEvent.click(screen.getByText('Create topic')); }); diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx index 6de0b52545a..807c6e9c7b1 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/Edit.tsx @@ -92,6 +92,7 @@ const Edit: React.FC<Props> = ({ const methods = 
useForm<TopicFormData>({ defaultValues, resolver: yupResolver(topicFormValidationSchema), + mode: 'onChange', }); const [isSubmitting, setIsSubmitting] = React.useState<boolean>(false); diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx index 83214b376c8..02e11b9c4cf 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Edit/__test__/Edit.spec.tsx @@ -117,13 +117,15 @@ describe('Edit Component', () => { renderComponent({ updateTopic: updateTopicMock }, undefined); const btn = screen.getAllByText(/Save/i)[0]; - expect(btn).toBeEnabled(); await act(() => { userEvent.type( screen.getByPlaceholderText('Min In Sync Replicas'), '1' ); + }); + + await act(() => { userEvent.click(btn); }); expect(updateTopicMock).toHaveBeenCalledTimes(1); @@ -145,6 +147,8 @@ describe('Edit Component', () => { screen.getByPlaceholderText('Min In Sync Replicas'), '1' ); + }); + await act(() => { userEvent.click(btn); }); expect(updateTopicMock).toHaveBeenCalledTimes(1); diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx index d9d6427a25c..d2cb51fae5d 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx @@ -51,14 +51,14 @@ const TopicForm: React.FC<Props> = ({ }) => { const { control, - formState: { errors }, + formState: { errors, isDirty, isValid }, } = useFormContext(); const getCleanUpPolicy = CleanupPolicyOptions.find((option: SelectOption) => { return option.value === cleanUpPolicy?.toLowerCase(); })?.value || CleanupPolicyOptions[0].value; return ( - <StyledForm onSubmit={onSubmit}> + <StyledForm onSubmit={onSubmit} aria-label="topic form"> <fieldset disabled={isSubmitting}> <fieldset disabled={isEditing}> <S.Column> @@ -125,10 +125,10 @@ const TopicForm: React.FC<Props> = ({ placeholder="Min In Sync Replicas" min="1" defaultValue={inSyncReplicas} - name="minInsyncReplicas" + name="minInSyncReplicas" /> <FormError> - <ErrorMessage errors={errors} name="minInsyncReplicas" /> + <ErrorMessage errors={errors} name="minInSyncReplicas" /> </FormError> </div> <div> @@ -209,7 +209,12 @@ const TopicForm: React.FC<Props> = ({ <S.CustomParamsHeading>Custom parameters</S.CustomParamsHeading> <CustomParams isSubmitting={isSubmitting} /> <S.ButtonWrapper> - <Button type="submit" buttonType="primary" buttonSize="L"> + <Button + type="submit" + buttonType="primary" + buttonSize="L" + disabled={!isValid || isSubmitting || !isDirty} + > {isEditing ? 
'Save' : 'Create topic'} </Button> <Button type="button" buttonType="primary" buttonSize="L"> diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx index b224c8cd8e1..fa12e8728c1 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/__tests__/TopicForm.spec.tsx @@ -1,9 +1,10 @@ import React, { PropsWithChildren } from 'react'; import { render } from 'lib/testHelpers'; -import { screen } from '@testing-library/dom'; +import { fireEvent, screen } from '@testing-library/dom'; import { FormProvider, useForm } from 'react-hook-form'; import TopicForm, { Props } from 'components/Topics/shared/Form/TopicForm'; import userEvent from '@testing-library/user-event'; +import { act } from 'react-dom/test-utils'; const isSubmitting = false; const onSubmit = jest.fn(); @@ -60,12 +61,19 @@ describe('TopicForm', () => { expectByRoleAndNameToBeInDocument('button', 'Create topic'); }); - it('submits', () => { + it('submits', async () => { renderComponent({ isSubmitting, onSubmit: onSubmit.mockImplementation((e) => e.preventDefault()), }); + await act(() => { + userEvent.type(screen.getByPlaceholderText('Topic Name'), 'topicName'); + }); + await act(() => { + fireEvent.submit(screen.getByLabelText('topic form')); + }); + userEvent.click(screen.getByRole('button', { name: 'Create topic' })); expect(onSubmit).toBeCalledTimes(1); }); diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index db185821095..967bc0bbafe 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -62,7 +62,7 @@ export const topicFormValidationSchema = yup.object().shape({ .min(1) .required() .typeError('Replication factor is required and must be a number'), - minInsyncReplicas: yup + minInSyncReplicas: yup .number() .min(1) .required() diff --git a/kafka-ui-react-app/src/redux/interfaces/topic.ts b/kafka-ui-react-app/src/redux/interfaces/topic.ts index 647b0e69f9c..13aa74cf407 100644 --- a/kafka-ui-react-app/src/redux/interfaces/topic.ts +++ b/kafka-ui-react-app/src/redux/interfaces/topic.ts @@ -64,7 +64,7 @@ export interface TopicFormDataRaw { name: string; partitions: number; replicationFactor: number; - minInsyncReplicas: number; + minInSyncReplicas: number; cleanupPolicy: string; retentionMs: number; retentionBytes: number; @@ -76,7 +76,7 @@ export interface TopicFormData { name: string; partitions: number; replicationFactor: number; - minInsyncReplicas: number; + minInSyncReplicas: number; cleanupPolicy: string; retentionMs: number; retentionBytes: number; diff --git a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts index 9d6ccc8e596..b1bcae7ebc4 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/__test__/reducer.spec.ts @@ -806,7 +806,7 @@ describe('topics Slice', () => { name: 'newTopic', partitions: 0, replicationFactor: 0, - minInsyncReplicas: 0, + minInSyncReplicas: 0, cleanupPolicy: 'DELETE', retentionMs: 1, retentionBytes: 1, @@ -864,7 +864,7 @@ describe('topics Slice', () => { name: topicName, partitions: 0, replicationFactor: 0, - minInsyncReplicas: 0, + minInSyncReplicas: 0, cleanupPolicy: 'DELETE', retentionMs: 0, retentionBytes: 0, 
diff --git a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts index 2a13822b764..74362e1012d 100644 --- a/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts +++ b/kafka-ui-react-app/src/redux/reducers/topics/topicsSlice.ts @@ -92,7 +92,7 @@ export const formatTopicCreation = (form: TopicFormData): TopicCreation => { retentionBytes, retentionMs, maxMessageBytes, - minInsyncReplicas, + minInSyncReplicas, customParams, } = form; @@ -105,7 +105,7 @@ export const formatTopicCreation = (form: TopicFormData): TopicCreation => { 'retention.ms': retentionMs.toString(), 'retention.bytes': retentionBytes.toString(), 'max.message.bytes': maxMessageBytes.toString(), - 'min.insync.replicas': minInsyncReplicas.toString(), + 'min.insync.replicas': minInSyncReplicas.toString(), ...Object.values(customParams || {}).reduce(topicReducer, {}), }, }; @@ -153,7 +153,7 @@ const formatTopicUpdate = (form: TopicFormDataRaw): TopicUpdate => { retentionBytes, retentionMs, maxMessageBytes, - minInsyncReplicas, + minInSyncReplicas, customParams, } = form; @@ -163,7 +163,7 @@ const formatTopicUpdate = (form: TopicFormDataRaw): TopicUpdate => { 'retention.ms': retentionMs, 'retention.bytes': retentionBytes, 'max.message.bytes': maxMessageBytes, - 'min.insync.replicas': minInsyncReplicas, + 'min.insync.replicas': minInSyncReplicas, ...Object.values(customParams || {}).reduce(topicReducer, {}), }, };
null
train
train
2022-07-18T12:57:28
"2022-05-26T11:46:32Z"
armenuikafka
train
provectus/kafka-ui/1775_2247
provectus/kafka-ui
provectus/kafka-ui/1775
provectus/kafka-ui/2247
[ "timestamp(timedelta=0.0, similarity=0.9150166619778775)", "connected" ]
83222edc62ef2ab370ace84df506edb225515115
cbd4e4a52adf8ca7b15c84a5c18331d6359eb51e
[ "Hello there niyatidoshi! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hey, thanks for reaching out.\r\n\r\nIf your ksqldb server uses basic auth, unfortunately we currently do not support auth. We're planning on implementing this in future releases.", "Hey Haarolean,\r\n\r\nThanks for the update. Our KSql DB is in Confluent Cloud. Here is the doc that was followed https://docs.confluent.io/cloud/current/get-started/ksql.html and for the 1st step (Access Control) we chose Granular access. For the connectivity, an additional Key/Secret was created. Hope this helps.\r\n\r\nCould you pls include this feature in your future release?\r\n\r\n\r\nThank you.", "@niyatidoshi we'll try to for sure :) thanks for the instructions", "When will this be released :/", "> When will this be released :/\r\n\r\n@Diyaa1 pull `master`-labeled image. Caution: it's a dev branch." ]
[ "let's extract both username and password as variables here", "let's add `@ToString` and exclude password field", "the same with tostring" ]
"2022-07-05T08:42:58Z"
[ "type/enhancement", "good first issue", "scope/backend", "status/accepted" ]
KSQLDB: Support Basic Auth
Hi Team, I'm trying to connect to a KSQL DB which is on Confluent Cloud, but I'm getting the error "Failed to fetch tables and streams". I've added the below property in the Docker-Compose file: KAFKA_CLUSTERS_0_KSQLDBSERVER: https://test-db-xxx.confluent.cloud To connect to that DB, I guess we need to provide a Key/Secret, and I don't see properties to provide them. Could you please advise what I am missing here? Thank you.
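For reference, the fix recorded in the gold patch below exposes basic-auth credentials through two new environment variables. A minimal docker-compose sketch, assuming placeholder values for the key and secret (only the variable names are confirmed by the patched README):

```yaml
# Sketch: variable names come from the patched README; all values are placeholders.
services:
  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    environment:
      KAFKA_CLUSTERS_0_NAME: confluent-cloud
      KAFKA_CLUSTERS_0_KSQLDBSERVER: https://test-db-xxx.confluent.cloud
      KAFKA_CLUSTERS_0_KSQLDBSERVERAUTH_USERNAME: ksql-api-key      # placeholder
      KAFKA_CLUSTERS_0_KSQLDBSERVERAUTH_PASSWORD: ksql-api-secret   # placeholder
```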
[ "README.md", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java" ]
[ "README.md", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalKsqlServer.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java" ]
[ "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/KsqlServiceTest.java", "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java", "kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java" ]
diff --git a/README.md b/README.md index 5a0f51cf07e..494909e6730 100644 --- a/README.md +++ b/README.md @@ -163,6 +163,8 @@ For example, if you want to use an environment variable to set the `name` parame |`KAFKA_CLUSTERS_0_NAME` | Cluster name |`KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS` |Address where to connect |`KAFKA_CLUSTERS_0_KSQLDBSERVER` | KSQL DB server address +|`KAFKA_CLUSTERS_0_KSQLDBSERVERAUTH_USERNAME` | KSQL DB server's basic authentication username +|`KAFKA_CLUSTERS_0_KSQLDBSERVERAUTH_PASSWORD` | KSQL DB server's basic authentication password |`KAFKA_CLUSTERS_0_PROPERTIES_SECURITY_PROTOCOL` |Security protocol to connect to the brokers. For SSL connection use "SSL", for plaintext connection don't set this environment variable |`KAFKA_CLUSTERS_0_SCHEMAREGISTRY` |SchemaRegistry's address |`KAFKA_CLUSTERS_0_SCHEMAREGISTRYAUTH_USERNAME` |SchemaRegistry's basic authentication username diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java index 2e8026648d7..8d051234abb 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/client/KsqlClient.java @@ -3,7 +3,9 @@ import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.ObjectMapper; import com.provectus.kafka.ui.exception.UnprocessableEntityException; +import com.provectus.kafka.ui.model.KafkaCluster; import com.provectus.kafka.ui.model.KsqlCommandResponseDTO; +import com.provectus.kafka.ui.service.ksql.KsqlApiClient; import com.provectus.kafka.ui.strategy.ksql.statement.BaseStrategy; import lombok.RequiredArgsConstructor; import lombok.SneakyThrows; @@ -23,9 +25,10 @@ public class KsqlClient { private final WebClient webClient; private final ObjectMapper mapper; - public Mono<KsqlCommandResponseDTO> execute(BaseStrategy ksqlStatement) { + public Mono<KsqlCommandResponseDTO> execute(BaseStrategy ksqlStatement, KafkaCluster cluster) { return webClient.post() .uri(ksqlStatement.getUri()) + .headers(httpHeaders -> KsqlApiClient.setBasicAuthIfEnabled(httpHeaders, cluster)) .accept(new MediaType("application", "vnd.ksql.v1+json")) .body(BodyInserters.fromValue(ksqlStatement.getKsqlCommand())) .retrieve() diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java index 0d83e143c5d..de4c8bc3477 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/ClustersProperties.java @@ -8,6 +8,7 @@ import java.util.Set; import javax.annotation.PostConstruct; import lombok.Data; +import lombok.ToString; import org.springframework.boot.context.properties.ConfigurationProperties; import org.springframework.context.annotation.Configuration; import org.springframework.util.StringUtils; @@ -26,6 +27,7 @@ public static class Cluster { String schemaRegistry; SchemaRegistryAuth schemaRegistryAuth; String ksqldbServer; + KsqldbServerAuth ksqldbServerAuth; String schemaNameTemplate = "%s-value"; String keySchemaNameTemplate = "%s-key"; String protobufFile; @@ -57,6 +59,13 @@ public static class SchemaRegistryAuth { String password; } + @Data + @ToString(exclude = "password") + public static class KsqldbServerAuth { + String username; + String password; + } + @PostConstruct public void validateAndSetDefaults() { 
validateClusterNames(); diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java index 1eb6199f96c..5709709e9b4 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ClusterMapper.java @@ -17,6 +17,7 @@ import com.provectus.kafka.ui.model.InternalBrokerConfig; import com.provectus.kafka.ui.model.InternalBrokerDiskUsage; import com.provectus.kafka.ui.model.InternalClusterState; +import com.provectus.kafka.ui.model.InternalKsqlServer; import com.provectus.kafka.ui.model.InternalPartition; import com.provectus.kafka.ui.model.InternalReplica; import com.provectus.kafka.ui.model.InternalSchemaRegistry; @@ -53,6 +54,7 @@ public interface ClusterMapper { @Mapping(target = "protobufFile", source = "protobufFile", qualifiedByName = "resolvePath") @Mapping(target = "properties", source = "properties", qualifiedByName = "setProperties") @Mapping(target = "schemaRegistry", source = ".", qualifiedByName = "setSchemaRegistry") + @Mapping(target = "ksqldbServer", source = ".", qualifiedByName = "setKsqldbServer") KafkaCluster toKafkaCluster(ClustersProperties.Cluster clusterProperties); ClusterStatsDTO toClusterStats(InternalClusterState clusterState); @@ -110,6 +112,24 @@ default InternalSchemaRegistry setSchemaRegistry(ClustersProperties.Cluster clus return internalSchemaRegistry.build(); } + @Named("setKsqldbServer") + default InternalKsqlServer setKsqldbServer(ClustersProperties.Cluster clusterProperties) { + if (clusterProperties == null + || clusterProperties.getKsqldbServer() == null) { + return null; + } + + InternalKsqlServer.InternalKsqlServerBuilder internalKsqlServerBuilder = + InternalKsqlServer.builder().url(clusterProperties.getKsqldbServer()); + + if (clusterProperties.getKsqldbServerAuth() != null) { + internalKsqlServerBuilder.username(clusterProperties.getKsqldbServerAuth().getUsername()); + internalKsqlServerBuilder.password(clusterProperties.getKsqldbServerAuth().getPassword()); + } + + return internalKsqlServerBuilder.build(); + } + TopicDetailsDTO toTopicDetails(InternalTopic topic); @Mapping(target = "isReadOnly", source = "readOnly") diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalKsqlServer.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalKsqlServer.java new file mode 100644 index 00000000000..a1c715bb586 --- /dev/null +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalKsqlServer.java @@ -0,0 +1,14 @@ +package com.provectus.kafka.ui.model; + +import lombok.Builder; +import lombok.Data; +import lombok.ToString; + +@Data +@ToString(exclude = "password") +@Builder(toBuilder = true) +public class InternalKsqlServer { + private final String url; + private final String username; + private final String password; +} diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java index b9f1ea96768..eab02e789f0 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/KafkaCluster.java @@ -21,7 +21,7 @@ public class KafkaCluster { private final String jmxPassword; private final String bootstrapServers; private final InternalSchemaRegistry schemaRegistry; - private final String ksqldbServer; + private final InternalKsqlServer 
ksqldbServer; private final List<KafkaConnectCluster> kafkaConnect; private final String schemaNameTemplate; private final String keySchemaNameTemplate; diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java index 4e970dc0c8c..6f74ede75ec 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KsqlService.java @@ -28,10 +28,10 @@ public Mono<KsqlCommandResponseDTO> executeKsqlCommand(KafkaCluster cluster, e instanceof ClusterNotFoundException ? e : new KsqlDbNotFoundException(); return Mono.error(throwable); }) - .flatMap(host -> getStatementStrategyForKsqlCommand(ksqlCommand) - .map(statement -> statement.host(host)) + .flatMap(ksqlServer -> getStatementStrategyForKsqlCommand(ksqlCommand) + .map(statement -> statement.host(ksqlServer.getUrl())) ) - .flatMap(ksqlClient::execute); + .flatMap(baseStrategy -> ksqlClient.execute(baseStrategy, cluster)); } private Mono<BaseStrategy> getStatementStrategyForKsqlCommand( diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java index 161c284c972..b83da666aaf 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ksql/KsqlApiClient.java @@ -8,6 +8,7 @@ import com.fasterxml.jackson.databind.JsonNode; import com.fasterxml.jackson.databind.ObjectMapper; import com.fasterxml.jackson.databind.node.TextNode; +import com.provectus.kafka.ui.exception.ValidationException; import com.provectus.kafka.ui.model.KafkaCluster; import com.provectus.kafka.ui.service.ksql.response.ResponseParser; import java.util.List; @@ -18,6 +19,7 @@ import lombok.Value; import lombok.extern.slf4j.Slf4j; import org.springframework.core.codec.DecodingException; +import org.springframework.http.HttpHeaders; import org.springframework.http.MediaType; import org.springframework.http.codec.json.Jackson2JsonDecoder; import org.springframework.util.MimeTypeUtils; @@ -79,12 +81,25 @@ private WebClient webClient() { .build(); return WebClient.builder() .codecs(c -> c.defaultCodecs().maxInMemorySize((int) maxBuffSize.toBytes())) + .defaultHeaders(httpHeaders -> setBasicAuthIfEnabled(httpHeaders, cluster)) .exchangeStrategies(exchangeStrategies) .build(); } + public static void setBasicAuthIfEnabled(HttpHeaders headers, KafkaCluster cluster) { + String username = cluster.getKsqldbServer().getUsername(); + String password = cluster.getKsqldbServer().getPassword(); + if (username != null && password != null) { + headers.setBasicAuth(username, password); + } else if (username != null) { + throw new ValidationException("You specified username but did not specify password"); + } else if (password != null) { + throw new ValidationException("You specified password but did not specify username"); + } + } + private String baseKsqlDbUri() { - return cluster.getKsqldbServer(); + return cluster.getKsqldbServer().getUrl(); } private KsqlRequest ksqlRequest(String ksql, Map<String, String> streamProperties) {
diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/KsqlServiceTest.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/KsqlServiceTest.java index f41b595e791..ed434efba44 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/KsqlServiceTest.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/KsqlServiceTest.java @@ -2,6 +2,7 @@ import static org.assertj.core.api.Assertions.assertThat; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; import static org.mockito.Mockito.times; import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; @@ -9,6 +10,7 @@ import com.provectus.kafka.ui.client.KsqlClient; import com.provectus.kafka.ui.exception.KsqlDbNotFoundException; import com.provectus.kafka.ui.exception.UnprocessableEntityException; +import com.provectus.kafka.ui.model.InternalKsqlServer; import com.provectus.kafka.ui.model.KafkaCluster; import com.provectus.kafka.ui.model.KsqlCommandDTO; import com.provectus.kafka.ui.model.KsqlCommandResponseDTO; @@ -62,7 +64,7 @@ void shouldThrowUnprocessableEntityExceptionOnExecuteKsqlCommand() { KsqlCommandDTO command = (new KsqlCommandDTO()).ksql("CREATE STREAM users WITH (KAFKA_TOPIC='users');"); KafkaCluster kafkaCluster = Mockito.mock(KafkaCluster.class); - when(kafkaCluster.getKsqldbServer()).thenReturn("localhost:8088"); + when(kafkaCluster.getKsqldbServer()).thenReturn(InternalKsqlServer.builder().url("localhost:8088").build()); StepVerifier.create(ksqlService.executeKsqlCommand(kafkaCluster, Mono.just(command))) .verifyError(UnprocessableEntityException.class); @@ -77,8 +79,8 @@ void shouldSetHostToStrategy() { KsqlCommandDTO command = (new KsqlCommandDTO()).ksql("describe streams;"); KafkaCluster kafkaCluster = Mockito.mock(KafkaCluster.class); - when(kafkaCluster.getKsqldbServer()).thenReturn(host); - when(ksqlClient.execute(any())).thenReturn(Mono.just(new KsqlCommandResponseDTO())); + when(kafkaCluster.getKsqldbServer()).thenReturn(InternalKsqlServer.builder().url(host).build()); + when(ksqlClient.execute(any(), any())).thenReturn(Mono.just(new KsqlCommandResponseDTO())); ksqlService.executeKsqlCommand(kafkaCluster, Mono.just(command)).block(); assertThat(alternativeStrategy.getUri()).isEqualTo(host + "/ksql"); @@ -90,12 +92,12 @@ void shouldCallClientAndReturnResponse() { KafkaCluster kafkaCluster = Mockito.mock(KafkaCluster.class); KsqlCommandResponseDTO response = new KsqlCommandResponseDTO().message("success"); - when(kafkaCluster.getKsqldbServer()).thenReturn("host"); - when(ksqlClient.execute(any())).thenReturn(Mono.just(response)); + when(kafkaCluster.getKsqldbServer()).thenReturn(InternalKsqlServer.builder().url("host").build()); + when(ksqlClient.execute(any(), any())).thenReturn(Mono.just(response)); KsqlCommandResponseDTO receivedResponse = ksqlService.executeKsqlCommand(kafkaCluster, Mono.just(command)).block(); - verify(ksqlClient, times(1)).execute(alternativeStrategy); + verify(ksqlClient, times(1)).execute(eq(alternativeStrategy), any()); assertThat(receivedResponse).isEqualTo(response); } diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java index 5956d730829..a797175091f 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlApiClientTest.java @@ 
-9,6 +9,7 @@ import com.fasterxml.jackson.databind.node.TextNode; import com.provectus.kafka.ui.AbstractIntegrationTest; import com.provectus.kafka.ui.container.KsqlDbContainer; +import com.provectus.kafka.ui.model.InternalKsqlServer; import com.provectus.kafka.ui.model.KafkaCluster; import java.time.Duration; import java.util.List; @@ -42,7 +43,8 @@ static void stopContainer() { // Tutorial is here: https://ksqldb.io/quickstart.html @Test void ksqTutorialQueriesWork() { - var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(), maxBuffSize); + var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer( + InternalKsqlServer.builder().url(KSQL_DB.url()).build()).build(), maxBuffSize); execCommandSync(client, "CREATE STREAM riderLocations (profileId VARCHAR, latitude DOUBLE, longitude DOUBLE) " + "WITH (kafka_topic='locations', value_format='json', partitions=1);", diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java index d670680ea21..89823d643ea 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/ksql/KsqlServiceV2Test.java @@ -4,6 +4,7 @@ import com.provectus.kafka.ui.AbstractIntegrationTest; import com.provectus.kafka.ui.container.KsqlDbContainer; +import com.provectus.kafka.ui.model.InternalKsqlServer; import com.provectus.kafka.ui.model.KafkaCluster; import com.provectus.kafka.ui.model.KsqlStreamDescriptionDTO; import com.provectus.kafka.ui.model.KsqlTableDescriptionDTO; @@ -34,7 +35,8 @@ static void init() { @AfterAll static void cleanup() { - var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(), maxBuffSize); + var client = new KsqlApiClient(KafkaCluster.builder().ksqldbServer( + InternalKsqlServer.builder().url(KSQL_DB.url()).build()).build(), maxBuffSize); TABLES_TO_DELETE.forEach(t -> client.execute(String.format("DROP TABLE IF EXISTS %s DELETE TOPIC;", t), Map.of()) @@ -51,7 +53,7 @@ static void cleanup() { @Test void listStreamsReturnsAllKsqlStreams() { - var cluster = KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(); + var cluster = KafkaCluster.builder().ksqldbServer(InternalKsqlServer.builder().url(KSQL_DB.url()).build()).build(); var streamName = "stream_" + System.currentTimeMillis(); STREAMS_TO_DELETE.add(streamName); @@ -80,7 +82,7 @@ void listStreamsReturnsAllKsqlStreams() { @Test void listTablesReturnsAllKsqlTables() { - var cluster = KafkaCluster.builder().ksqldbServer(KSQL_DB.url()).build(); + var cluster = KafkaCluster.builder().ksqldbServer(InternalKsqlServer.builder().url(KSQL_DB.url()).build()).build(); var tableName = "table_" + System.currentTimeMillis(); TABLES_TO_DELETE.add(tableName);
val
train
2022-07-06T07:10:40
"2022-03-29T12:47:38Z"
niyatidoshi
train
provectus/kafka-ui/2212_2259
provectus/kafka-ui
provectus/kafka-ui/2212
provectus/kafka-ui/2259
[ "connected", "timestamp(timedelta=0.0, similarity=0.8435926964333799)" ]
0b76b1251859f7acc4daf4afd746b81945b7c720
3ab44233edc663ffdc03c08e315834cfd14ff1d0
[ "@Haarolean is it OK to push it to internal ECR ? we can re-use our branch-deployment logic to build image only. How you would like to trigger such logic ?", "Yes, yes, and, the same way, via labels." ]
[]
"2022-07-12T10:05:42Z"
[ "good first issue", "status/accepted", "scope/infrastructure", "type/feature" ]
Implement a flow to publish temporary docker images
In some cases we need to build an image within a PR to let others test it out. It would be nice to build and publish such images automatically.
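The workflow added by the gold patch below is label-driven rather than running on every push. A condensed sketch of just the trigger and guard, trimmed from the patch:

```yaml
# Condensed from .github/workflows/build-public-image.yml in the patch below.
on:
  workflow_dispatch:
  pull_request:
    types: ['labeled']
jobs:
  build:
    if: ${{ github.event.label.name == 'status/image_testing' }}
    runs-on: ubuntu-latest
```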
[]
[ ".github/workflows/build-public-image.yml", ".github/workflows/delete-public-image.yml" ]
[]
diff --git a/.github/workflows/build-public-image.yml b/.github/workflows/build-public-image.yml new file mode 100644 index 00000000000..32984dadad7 --- /dev/null +++ b/.github/workflows/build-public-image.yml @@ -0,0 +1,78 @@ +name: Build Docker image and push +on: + workflow_dispatch: + pull_request: + types: ['labeled'] +jobs: + build: + if: ${{ github.event.label.name == 'status/image_testing' }} + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - name: get branch name + id: extract_branch + run: | + tag='${{ github.event.pull_request.number }}' + echo ::set-output name=tag::${tag} + - name: Cache local Maven repository + uses: actions/cache@v3 + with: + path: ~/.m2/repository + key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }} + restore-keys: | + ${{ runner.os }}-maven- + - name: Set up JDK 1.13 + uses: actions/setup-java@v1 + with: + java-version: 1.13 + - name: Build + id: build + run: | + mvn versions:set -DnewVersion=$GITHUB_SHA + mvn clean package -Pprod -DskipTests + export VERSION=$(mvn -q -Dexec.executable=echo -Dexec.args='${project.version}' --non-recursive exec:exec) + echo "::set-output name=version::${VERSION}" + - name: Set up QEMU + uses: docker/setup-qemu-action@v1 + - name: Set up Docker Buildx + id: buildx + uses: docker/setup-buildx-action@v1 + - name: Cache Docker layers + uses: actions/cache@v3 + with: + path: /tmp/.buildx-cache + key: ${{ runner.os }}-buildx-${{ github.sha }} + restore-keys: | + ${{ runner.os }}-buildx- + - name: Configure AWS credentials for Kafka-UI account + uses: aws-actions/configure-aws-credentials@v1 + with: + aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }} + aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} + aws-region: us-east-1 + - name: Login to Amazon ECR + id: login-ecr + uses: aws-actions/amazon-ecr-login@v1 + with: + registry-type: 'public' + - name: Build and push + id: docker_build_and_push + uses: docker/build-push-action@v2 + with: + builder: ${{ steps.buildx.outputs.name }} + context: kafka-ui-api + push: true + tags: public.ecr.aws/provectus/kafka-ui-custom-build:${{ steps.extract_branch.outputs.tag }} + build-args: | + JAR_FILE=kafka-ui-api-${{ steps.build.outputs.version }}.jar + cache-from: type=local,src=/tmp/.buildx-cache + cache-to: type=local,dest=/tmp/.buildx-cache + - name: make comment with private deployment link + uses: peter-evans/create-or-update-comment@v2 + with: + issue-number: ${{ github.event.pull_request.number }} + body: | + Image published at public.ecr.aws/provectus/kafka-ui-custom-build:${{ steps.extract_branch.outputs.tag }} + + outputs: + tag: ${{ steps.extract_branch.outputs.tag }} \ No newline at end of file diff --git a/.github/workflows/delete-public-image.yml b/.github/workflows/delete-public-image.yml new file mode 100644 index 00000000000..ea185917410 --- /dev/null +++ b/.github/workflows/delete-public-image.yml @@ -0,0 +1,40 @@ +name: Delete Public ECR Image +on: + workflow_dispatch: + pull_request: + types: ['unlabeled', 'closed'] +jobs: + remove: + if: ${{ github.event.label.name == 'status/image_testing' || ( github.event.action == 'closed' && (contains(github.event.pull_request.labels, 'status/image_testing'))) }} + runs-on: ubuntu-latest + steps: + - name: get branch name + id: extract_branch + run: | + echo + tag='${{ github.event.pull_request.number }}' + echo ::set-output name=tag::${tag} + - name: Configure AWS credentials for Kafka-UI account + uses: aws-actions/configure-aws-credentials@v1 + with: + aws-access-key-id: ${{ 
secrets.AWS_ACCESS_KEY_ID }} + aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} + aws-region: us-east-1 + - name: Login to Amazon ECR + id: login-ecr + uses: aws-actions/amazon-ecr-login@v1 + with: + registry-type: 'public' + - name: Remove from ECR + id: remove_from_ecr + run: | + aws ecr-public batch-delete-image \ + --repository-name kafka-ui-custom-build \ + --image-ids imageTag=${{ steps.extract_branch.outputs.tag }} \ + --region us-east-1 + - name: make comment with private deployment link + uses: peter-evans/create-or-update-comment@v2 + with: + issue-number: ${{ github.event.pull_request.number }} + body: | + Image tag public.ecr.aws/provectus/kafka-ui-custom-build:${{ steps.extract_branch.outputs.tag }} has been removed \ No newline at end of file
null
test
train
2022-07-12T10:40:52
"2022-06-27T17:43:19Z"
Haarolean
train
provectus/kafka-ui/2163_2277
provectus/kafka-ui
provectus/kafka-ui/2163
provectus/kafka-ui/2277
[ "connected" ]
edb7da6fce83e149b9e9c34182708f17f1c1240c
085dfec389296597d8ca595f837809b3b130ee63
[ "frontend requirements: #2566", "Hey @Haarolean I would like to take this up. Do we want to give the users a limited set of predefined formats, or do we want to give them full flexibility to input a date format which we can use to safely parse any date to?", "I think giving them a limited set of formats will be better as we will reduce uncertainty at runtime. And, that set of formats can come from a standard set of formats used around the globe.\r\n\r\nWe can also do a hybrid, where we provide a few standard formats & also let the user input their own format as text input. To reduce complexity we can parse and validate that date format and provide an appropriate error to the user.\r\n", "Discussed on discord: needs discussion\r\n\r\nthere are a few problems with this one:\r\n1. we don't have a design solution for that one, yet\r\n2. Not sure that we'd need this any longer since we implemented a dynamic timestamp format depending on the users' locale (US vs non-US), which looks like it's enough for now\r\n\r\n", "It would be useful if more granular (sub-second, like milliseconds) timestamp information could be displayed too" ]
[ "```suggestion\r\n /api/info/timestampformat:\r\n```", "- that's not related to kafka, let's make it `timestamp.format` for example\r\n- D is a day of the year, we need dd\r\n- also, let's use ddMM as a default one instead of MMdd\r\n", "Updated PR please check", "Updated PR please check", "nice" ]
"2022-07-15T11:39:29Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted", "status/needs-attention" ]
Make message timestamp format configurable
We changed the default date format as requested in #2162. We need the datetime format to be configurable on the frontend for those who prefer the US format or anything else.
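The backend half of the gold patch below reads a single Spring property. A configuration sketch: the `timestamp.format` key and its `dd.MM.YYYY HH:mm:ss` default are confirmed by the patch, while the environment-variable spelling is an assumption based on Spring's relaxed binding.

```yaml
# application.yml sketch; only timestamp.format and its default come from the patch.
timestamp:
  format: "dd.MM.YYYY HH:mm:ss"   # served to the frontend via GET /api/info/timestampformat
# Assumed env-var equivalent (relaxed binding): TIMESTAMP_FORMAT="MM.dd.YYYY HH:mm:ss"
```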
[ "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/InfoController.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/InfoController.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/InfoController.java new file mode 100644 index 00000000000..66e5d70bd33 --- /dev/null +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/InfoController.java @@ -0,0 +1,25 @@ +package com.provectus.kafka.ui.controller; + +import com.provectus.kafka.ui.api.TimeStampFormatApi; +import com.provectus.kafka.ui.model.TimeStampFormatDTO; +import lombok.RequiredArgsConstructor; +import lombok.extern.slf4j.Slf4j; +import org.springframework.beans.factory.annotation.Value; +import org.springframework.http.ResponseEntity; +import org.springframework.web.bind.annotation.RestController; +import org.springframework.web.server.ServerWebExchange; +import reactor.core.publisher.Mono; + +@RestController +@RequiredArgsConstructor +@Slf4j +public class InfoController extends AbstractController implements TimeStampFormatApi { + + @Value("${timestamp.format:dd.MM.YYYY HH:mm:ss}") + private String timeStampFormat; + + @Override + public Mono<ResponseEntity<TimeStampFormatDTO>> getTimeStampFormat(ServerWebExchange exchange) { + return Mono.just(ResponseEntity.ok(new TimeStampFormatDTO().timeStampFormat(timeStampFormat))); + } +} diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index 63f313e2ac1..f7c7fa1ec13 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -1748,7 +1748,19 @@ paths: $ref: '#/components/schemas/PartitionsIncreaseResponse' 404: description: Not found - + /api/info/timestampformat: + get: + tags: + - TimeStampFormat + summary: getTimeStampFormat + operationId: getTimeStampFormat + responses: + 200: + description: OK + content: + application/json: + schema: + $ref: '#/components/schemas/TimeStampFormat' components: schemas: ErrorResponse: @@ -2351,6 +2363,12 @@ components: name: type: string + TimeStampFormat: + type: object + properties: + timeStampFormat: + type: string + TopicMessageConsuming: type: object properties:
null
train
train
2022-08-04T15:38:27
"2022-06-14T11:30:27Z"
Haarolean
train
provectus/kafka-ui/2076_2291
provectus/kafka-ui
provectus/kafka-ui/2076
provectus/kafka-ui/2291
[ "connected", "timestamp(timedelta=1.0, similarity=0.9685220438654787)" ]
37f1d2254e66b31bc827438b38f1505ae331c415
2b5dd270e22c6455c6a5999131dbf3ce3b770429
[ "Implement sorting by: name, connect, type and status.", "Sorting by Name, Connect, Type, Plugin, Status implemented." ]
[]
"2022-07-18T15:30:43Z"
[ "type/enhancement", "good first issue", "scope/backend", "scope/frontend", "status/accepted", "hacktoberfest", "status/pending-frontend" ]
Implement connectors sorting
1. Migrate the table to the new table component 2. Implement sorting 3. Related issue: https://github.com/provectus/kafka-ui/issues/2325 The list is not sorted at all. <img width="299" alt="image" src="https://user-images.githubusercontent.com/1494347/171179908-5447271c-e17d-4921-abfe-3a7e4239756a.png">
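On the API side, the gold patch below adds two optional query parameters to the connectors listing. An OpenAPI fragment mirroring that change (the sort columns and ASC are confirmed in the patch; the descending value is assumed):

```yaml
# Fragment mirroring the swagger change in the patch below.
parameters:
  - name: orderBy
    in: query
    required: false
    schema:
      $ref: '#/components/schemas/ConnectorColumnsToSort'   # NAME | CONNECT | TYPE | STATUS
  - name: sortOrder
    in: query
    required: false
    schema:
      $ref: '#/components/schemas/SortOrder'                # ASC confirmed; DESC assumed
```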
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java index 8011fe8e5fc..2813618abf9 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java @@ -3,13 +3,16 @@ import com.provectus.kafka.ui.api.KafkaConnectApi; import com.provectus.kafka.ui.model.ConnectDTO; import com.provectus.kafka.ui.model.ConnectorActionDTO; +import com.provectus.kafka.ui.model.ConnectorColumnsToSortDTO; import com.provectus.kafka.ui.model.ConnectorDTO; import com.provectus.kafka.ui.model.ConnectorPluginConfigValidationResponseDTO; import com.provectus.kafka.ui.model.ConnectorPluginDTO; import com.provectus.kafka.ui.model.FullConnectorInfoDTO; import com.provectus.kafka.ui.model.NewConnectorDTO; +import com.provectus.kafka.ui.model.SortOrderDTO; import com.provectus.kafka.ui.model.TaskDTO; import com.provectus.kafka.ui.service.KafkaConnectService; +import java.util.Comparator; import java.util.Map; import javax.validation.Valid; import lombok.RequiredArgsConstructor; @@ -68,10 +71,16 @@ public Mono<ResponseEntity<Void>> deleteConnector(String clusterName, String con public Mono<ResponseEntity<Flux<FullConnectorInfoDTO>>> getAllConnectors( String clusterName, String search, + ConnectorColumnsToSortDTO orderBy, + SortOrderDTO sortOrder, ServerWebExchange exchange ) { + var comparator = sortOrder == null || sortOrder.equals(SortOrderDTO.ASC) + ? getConnectorsComparator(orderBy) + : getConnectorsComparator(orderBy).reversed(); return Mono.just(ResponseEntity.ok( - kafkaConnectService.getAllConnectors(getCluster(clusterName), search))); + kafkaConnectService.getAllConnectors(getCluster(clusterName), search).sort(comparator)) + ); } @Override @@ -142,4 +151,22 @@ public Mono<ResponseEntity<Flux<ConnectorPluginDTO>>> getConnectorPlugins( getCluster(clusterName), connectName, pluginName, requestBody) .map(ResponseEntity::ok); } + + private Comparator<FullConnectorInfoDTO> getConnectorsComparator(ConnectorColumnsToSortDTO orderBy) { + var defaultComparator = Comparator.comparing(FullConnectorInfoDTO::getName); + if (orderBy == null) { + return defaultComparator; + } + switch (orderBy) { + case CONNECT: + return Comparator.comparing(FullConnectorInfoDTO::getConnect); + case TYPE: + return Comparator.comparing(FullConnectorInfoDTO::getType); + case STATUS: + return Comparator.comparing(fullConnectorInfoDTO -> fullConnectorInfoDTO.getStatus().getState()); + case NAME: + default: + return defaultComparator; + } + } } diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index af6c6c3e6ae..d07c24a61b1 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -1267,6 +1267,16 @@ paths: required: false schema: type: string + - name: orderBy + in: query + required: false + schema: + $ref: '#/components/schemas/ConnectorColumnsToSort' + - name: sortOrder + in: query + required: false + schema: + $ref: '#/components/schemas/SortOrder' responses: 200: description: OK @@ -2009,6 +2019,14 @@ components: - REPLICATION_FACTOR - SIZE + ConnectorColumnsToSort: + type: string + enum: + - NAME + - CONNECT + - TYPE + - STATUS + SortOrder: type: string enum:
null
train
train
2022-10-14T11:01:19
"2022-05-31T13:04:31Z"
Haarolean
train
provectus/kafka-ui/1024_2305
provectus/kafka-ui
provectus/kafka-ui/1024
provectus/kafka-ui/2305
[ "connected" ]
6891f71452d2f2cb78daf7e9b21106fc571a4355
48325bc5adf446cf14763d298d19202c83d67e01
[ "1. When \"newest first\" is selected or \"submit\" button is pressed while \"newest first\" is selected we should use `seekType` as `LATEST` instead of `OFFSET`.\r\n2. The same for \"oldest first\": `seekType` should be `BEGINNING`.\r\n", "@Haarolean When we submit a message, it redirects to the topic messages page with the default params, so the seekDirection is reset to FORWARD (Oldest first). I can set the seekType to BEGINNING, but the results are the same. What about the newest first option, seekType=LATEST doesn't help, the new messages still don't appear. I think the problem comes from the back end.", "@simonyandev the first part is okay, results should be the same, adjusting the seekDirection to make it clear.\r\n\r\n@iliax any ideas about the second part?", "@simonyandev We fixed issue with wrong backend behaviour in Newest first mode, so I think you can continue work on UI", "@ssiradeghyan PTAL" ]
[]
"2022-07-20T14:33:56Z"
[ "type/bug", "good first issue", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
New incoming messages not visible when "Newest first" enabled
**Describe the bug** When "Newest first" is enabled, on Submit we should send the request with the latest partition offsets. **Steps to Reproduce** Steps to reproduce the behavior: 1. write messages to the topic 2. open the Messages tab 3. enable "Newest first" mode 4. write new messages 5. click Submit 6. nothing happens **Expected behavior** Newly submitted messages should be visible. To fix it, we should get the latest partition offsets and send them as params.
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx index d044fefabf9..5cf9b2af573 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Messages/Filters/Filters.tsx @@ -192,7 +192,17 @@ const Filters: React.FC<FiltersProps> = ({ setAttempt(attempt + 1); if (isSeekTypeControlVisible) { - props.seekType = isLive ? SeekType.LATEST : currentSeekType; + switch (seekDirection) { + case SeekDirection.FORWARD: + props.seekType = SeekType.BEGINNING; + break; + case SeekDirection.BACKWARD: + case SeekDirection.TAILING: + props.seekType = SeekType.LATEST; + break; + default: + props.seekType = currentSeekType; + } props.seekTo = selectedPartitions.map(({ value }) => { const offsetProperty = seekDirection === SeekDirection.FORWARD ? 'offsetMin' : 'offsetMax';
null
val
train
2022-07-21T12:03:13
"2021-10-27T12:16:03Z"
iliax
train
provectus/kafka-ui/2039_2313
provectus/kafka-ui
provectus/kafka-ui/2039
provectus/kafka-ui/2313
[ "connected" ]
66e4eede510014aca116d034b91515f5e8be5ec1
6891f71452d2f2cb78daf7e9b21106fc571a4355
[]
[]
"2022-07-21T09:39:14Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Make Submit button inactive when required fields aren't filled
**Describe the bug** The Submit button is inactive by default in create forms; it should also stay inactive if the user clicks into a required input field and then leaves it empty. **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to the schema registry 2. Create a schema 3. Make sure the submit button is inactive 4. Click into a required input field 5. Make sure the submit button became active 6. Click out of the field **Expected behavior** The Submit button should stay inactive if required data is missing from the forms: _**Create new Topic, Produce Message, Create new schema, Create Connector**_ **Screenshots** ![inactive submit](https://user-images.githubusercontent.com/104780608/170481725-0937c7ee-9711-4a14-82ae-00945fe626a0.png)
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/New/New.tsx b/kafka-ui-react-app/src/components/Schemas/New/New.tsx index 1aa843d6b59..e1474c15835 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/New.tsx +++ b/kafka-ui-react-app/src/components/Schemas/New/New.tsx @@ -32,7 +32,12 @@ const New: React.FC = () => { const { clusterName } = useAppParams<ClusterNameRoute>(); const navigate = useNavigate(); const dispatch = useAppDispatch(); - const methods = useForm<NewSchemaSubjectRaw>({ mode: 'onChange' }); + const methods = useForm<NewSchemaSubjectRaw>({ + mode: 'onChange', + defaultValues: { + schemaType: SchemaType.AVRO, + }, + }); const { register, handleSubmit, diff --git a/kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx b/kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx index 617dc8f4928..2ab2bb7a6dd 100644 --- a/kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx +++ b/kafka-ui-react-app/src/components/Topics/List/ActionsCell/ActionsCell.tsx @@ -1,10 +1,10 @@ import React from 'react'; -import { useDispatch } from 'react-redux'; import { CleanUpPolicy, SortOrder, TopicColumnsToSort, } from 'generated-sources'; +import { useAppDispatch } from 'lib/hooks/redux'; import ConfirmationModal from 'components/common/ConfirmationModal/ConfirmationModal'; import DropdownItem from 'components/common/Dropdown/DropdownItem'; import { TableCellProps } from 'components/common/SmartTable/TableColumn'; @@ -45,7 +45,7 @@ const ActionsCell: React.FC< }) => { const { isReadOnly, isTopicDeletionAllowed } = React.useContext(ClusterContext); - const dispatch = useDispatch(); + const dispatch = useAppDispatch(); const { clusterName } = useAppParams<ClusterNameRoute>(); const {
null
train
train
2022-07-21T08:08:23
"2022-05-26T11:46:32Z"
armenuikafka
train
provectus/kafka-ui/2039_2315
provectus/kafka-ui
provectus/kafka-ui/2039
provectus/kafka-ui/2315
[ "timestamp(timedelta=0.0, similarity=1.0)", "connected" ]
bffe316063e6c68b9d48ee18cdb92d2974e7c58c
d92fe63e8aaa146c2b6279d60898544bc1e6ed39
[]
[ "@rAzizbekyan pls take a look if this warning is valid", "@workshur I checked it and locally it's not showing error like this", "All added cases generate warnings. pls fix it\r\n```\r\nconsole.error\r\n Warning: An update to New inside a test was not wrapped in act(...).\r\n\r\n When testing, code that causes React state updates should be wrapped into act(...):\r\n\r\n act(() => {\r\n /* fire events that update state */\r\n });\r\n /* assert on the output */\r\n\r\n This ensures that you're testing the behavior the user would see in the browser. Learn more at https://reactjs.org/link/wrap-tests-with-act\r\n at New (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/src/components/Schemas/New/New.tsx:31:27)\r\n at Routes (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/packages/react-router/lib/components.tsx:253:3)\r\n at WithRoute (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/src/lib/testHelpers.tsx:33:55)\r\n at Router (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/packages/react-router/lib/components.tsx:173:13)\r\n at MemoryRouter (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/packages/react-router/lib/components.tsx:33:3)\r\n at QueryClientProvider (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/@[email protected]_ef5jwxihqo6n7gxfmzogljlgcm/node_modules/@tanstack/react-query/src/QueryClientProvider.tsx:71:3)\r\n at TestQueryClientProvider (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/src/lib/testHelpers.tsx:42:3)\r\n at Provider (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/[email protected]_nfqigfgwurfoimtkde74cji6ga/node_modules/react-redux/lib/components/Provider.js:19:3)\r\n at Object.<anonymous>.exports.ThemeProvider (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/node_modules/.pnpm/[email protected]_uuaz5p7xzfmtjacf6iqf7idnby/node_modules/styled-components/src/models/ServerStyleSheet.js:68:13)\r\n at AllTheProviders (/Users/workshur/repos/kafka-ui/kafka-ui-react-app/src/lib/testHelpers.tsx:67:5)\r\n```", "I fixed it all, I'm not sure what's the problem that it still fails as when I run tests locally everything is fine\r\n", "The issue is fixed" ]
"2022-07-21T11:39:08Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Make Submit button inactive when required fields aren't filled
**Describe the bug** The Submit button is inactive by default in create forms; it should also stay inactive if the user clicks into a required input field and then leaves it empty. **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to the schema registry 2. Create a schema 3. Make sure the submit button is inactive 4. Click into a required input field 5. Make sure the submit button became active 6. Click out of the field **Expected behavior** The Submit button should stay inactive if required data is missing from the forms: _**Create new Topic, Produce Message, Create new schema, Create Connector**_ **Screenshots** ![inactive submit](https://user-images.githubusercontent.com/104780608/170481725-0937c7ee-9711-4a14-82ae-00945fe626a0.png)
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/New/New.tsx", "kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/New/New.tsx b/kafka-ui-react-app/src/components/Schemas/New/New.tsx index c3d9ba70034..38bef9bc9b6 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/New.tsx +++ b/kafka-ui-react-app/src/components/Schemas/New/New.tsx @@ -105,11 +105,11 @@ const New: React.FC = () => { control={control} rules={{ required: 'Schema Type is required.' }} name="schemaType" + defaultValue={SchemaTypeOptions[0].value as SchemaType} render={({ field: { name, onChange, value } }) => ( <Select selectSize="M" name={name} - defaultValue={SchemaTypeOptions[0].value} value={value} onChange={onChange} minWidth="50%" diff --git a/kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx b/kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx index c4bcf77abc5..01b9b1cd8b9 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx +++ b/kafka-ui-react-app/src/components/Schemas/New/__test__/New.spec.tsx @@ -2,9 +2,12 @@ import React from 'react'; import New from 'components/Schemas/New/New'; import { render, WithRoute } from 'lib/testHelpers'; import { clusterSchemaNewPath } from 'lib/paths'; -import { screen } from '@testing-library/dom'; +import { act, screen } from '@testing-library/react'; +import userEvent from '@testing-library/user-event'; const clusterName = 'local'; +const subjectValue = 'subject'; +const schemaValue = 'schema'; describe('New Component', () => { beforeEach(() => { @@ -21,4 +24,26 @@ describe('New Component', () => { it('renders component', () => { expect(screen.getByText('Create new schema')).toBeInTheDocument(); }); + it('submit button will be disabled while form fields are not filled', () => { + const submitBtn = screen.getByRole('button', { name: /submit/i }); + expect(submitBtn).toBeDisabled(); + }); + it('submit button will be enabled when form fields are filled', async () => { + const subject = screen.getByPlaceholderText('Schema Name'); + const schema = screen.getAllByRole('textbox')[1]; + const schemaTypeSelect = screen.getByRole('listbox'); + + await act(() => { + userEvent.type(subject, subjectValue); + }); + await act(() => { + userEvent.type(schema, schemaValue); + }); + await act(() => { + userEvent.selectOptions(schemaTypeSelect, ['AVRO']); + }); + + const submitBtn = screen.getByRole('button', { name: /Submit/i }); + expect(submitBtn).toBeEnabled(); + }); });
null
train
train
2022-07-25T13:16:00
"2022-05-26T11:46:32Z"
armenuikafka
train
provectus/kafka-ui/2317_2318
provectus/kafka-ui
provectus/kafka-ui/2317
provectus/kafka-ui/2318
[ "connected", "timestamp(timedelta=1.0, similarity=0.8883881146328226)" ]
48325bc5adf446cf14763d298d19202c83d67e01
bff27f1b5bfc7ab1af339e829f0905df57f206df
[ "Hello there joseacl! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hi @Haarolean @germanosin.\r\n\r\nI've opened that issue because we're rolling out this tool in our clusters and due to the fact that we have clusters with different Kubernetes versions, we've realized the error described in the issue.\r\n\r\nCurrently, we're overriding the ingress template, so I've decided to create a PR https://github.com/provectus/kafka-ui/pull/2318 because it can happen to other users.\r\nLooking forward to hearing from you, \r\n\r\nThanks a lot!", "Hey @joseacl, thanks for raising the issue, we'll take a look at the PR :)" ]
[]
"2022-07-22T08:05:14Z"
[ "type/bug", "status/accepted", "scope/k8s" ]
Make Ingress resource compatible with kubernetes versions lower than 1.19
**Describe the bug** The helm template file for ingress is checking that **APIVersions** contains **networking.k8s.io/v1**, but this capability is available in Kubernetes since the version v1.8 **but for the NetworkPolicy resource and not for Ingress**. https://kubernetes.io/docs/reference/using-api/deprecation-guide/#networkpolicy-v116 So if you apply helm template with **--kube-version 1.18** for example, the ingress is going to be rendered using **apiVersion: networking.k8s.io/v1**, which is not supported by this version of Kubernetes: https://kubernetes.io/docs/reference/using-api/deprecation-guide/#ingress-v122 **Set up** Using helm version v3.8.0, which has the --kube-version flag available. Clone the repository and go to charts/kafka-ui **Steps to Reproduce** Steps to reproduce the behavior: 1. Execute the following command `helm template --kube-version 1.18 --set ingress.enabled=true --output-dir ./manifests .` **Expected behavior** The rendered Ingress resource should be compatible with the Kubernetes version specified, 1.18. Something like this ``` apiVersion: networking.k8s.io/v1beta1 kind: Ingress metadata: name: release-name-kafka-ui labels: helm.sh/chart: kafka-ui-0.0.4 app.kubernetes.io/name: kafka-ui app.kubernetes.io/instance: release-name app.kubernetes.io/version: "latest" app.kubernetes.io/managed-by: Helm spec: rules: - http: paths: - backend: serviceName: release-name-kafka-ui servicePort: 80 ``` But with the current template, the ingress is incompatible with versions lower than 1.19 ``` apiVersion: networking.k8s.io/v1 kind: Ingress metadata: name: release-name-kafka-ui labels: helm.sh/chart: kafka-ui-0.0.4 app.kubernetes.io/name: kafka-ui app.kubernetes.io/instance: release-name app.kubernetes.io/version: "latest" app.kubernetes.io/managed-by: Helm spec: rules: - http: paths: - backend: service: name: release-name-kafka-ui port: number: 80 pathType: Prefix ```
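The fix in the gold patch below gates the apiVersion on the cluster version in addition to the advertised capability. A condensed helm-template sketch of that condition, copied from the patch with unrelated parts trimmed:

```yaml
# Condensed from charts/kafka-ui/templates/ingress.yaml in the patch below.
{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19") -}}
apiVersion: networking.k8s.io/v1
{{- else if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }}
apiVersion: networking.k8s.io/v1beta1
{{- end }}
kind: Ingress
```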
[ ".github/workflows/helm.yaml", "charts/kafka-ui/templates/ingress.yaml" ]
[ ".github/workflows/helm.yaml", "charts/kafka-ui/templates/ingress.yaml" ]
[]
diff --git a/.github/workflows/helm.yaml b/.github/workflows/helm.yaml index 664a15e8a73..0da4bdc51e4 100644 --- a/.github/workflows/helm.yaml +++ b/.github/workflows/helm.yaml @@ -22,11 +22,11 @@ jobs: shell: bash run: | sed -i "s@enabled: false@enabled: true@g" charts/kafka-ui/values.yaml - K8S_VERSIONS=$(git ls-remote --refs --tags https://github.com/kubernetes/kubernetes.git | cut -d/ -f3 | grep -e '^v1\.[0-9]\{2\}\.[0]\{1,2\}$' | grep -v -e '^v1\.1[0-8]\{1\}' | cut -c2-) + K8S_VERSIONS=$(git ls-remote --refs --tags https://github.com/kubernetes/kubernetes.git | cut -d/ -f3 | grep -e '^v1\.[0-9]\{2\}\.[0]\{1,2\}$' | grep -v -e '^v1\.1[0-7]\{1\}' | cut -c2-) echo "NEXT K8S VERSIONS ARE GOING TO BE TESTED: $K8S_VERSIONS" echo "" for version in $K8S_VERSIONS do echo $version; - helm template charts/kafka-ui -f charts/kafka-ui/values.yaml | kubeval --additional-schema-locations https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master --strict -v $version; + helm template --kube-version $version --set ingress.enabled=true charts/kafka-ui -f charts/kafka-ui/values.yaml | kubeval --additional-schema-locations https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master --strict -v $version; done diff --git a/charts/kafka-ui/templates/ingress.yaml b/charts/kafka-ui/templates/ingress.yaml index 8631ea0130c..7659867b31a 100644 --- a/charts/kafka-ui/templates/ingress.yaml +++ b/charts/kafka-ui/templates/ingress.yaml @@ -1,7 +1,7 @@ {{- if .Values.ingress.enabled -}} {{- $fullName := include "kafka-ui.fullname" . -}} {{- $svcPort := .Values.service.port -}} -{{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1" }} +{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19" ) -}} apiVersion: networking.k8s.io/v1 {{- else if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 @@ -30,7 +30,7 @@ spec: rules: - http: paths: -{{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1" -}} +{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19" ) -}} {{- range .Values.ingress.precedingPaths }} - path: {{ .path }} pathType: Prefix
null
val
train
2022-07-21T14:57:49
"2022-07-22T06:53:51Z"
joseacl
train
provectus/kafka-ui/2169_2339
provectus/kafka-ui
provectus/kafka-ui/2169
provectus/kafka-ui/2339
[ "timestamp(timedelta=1.0, similarity=0.8648408167878369)", "connected" ]
bffe316063e6c68b9d48ee18cdb92d2974e7c58c
9827e01047ef7f0db2acc460d170fc1f284229ec
[ "Seems like that is a 80 symbols line delimiter. We can get rid of it." ]
[]
"2022-07-25T14:37:25Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Unnecessary line within KSQL DB / query
**Describe the bug** There is an unnecessary line within KSQL DB / query **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to KSQL DB 2. Press 'Execute KSQL Request' **Expected behavior** The line is not necessary. Display: 3456 × 2234 **Screenshots** <img width="1718" alt="line within ksql db" src="https://user-images.githubusercontent.com/104780608/173776414-5e0062e7-b804-4418-8ac2-66c1e414dcfa.png">
[ "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts" ]
[ "kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts index 1f5dcde061e..a8fa1cf7afa 100644 --- a/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts +++ b/kafka-ui-react-app/src/components/KsqlDb/Query/QueryForm/QueryForm.styled.ts @@ -65,11 +65,13 @@ export const Fieldset = styled.fieldset` export const SQLEditor = styled(BaseSQLEditor)( ({ readOnly, theme }) => - readOnly && css` - background: ${theme.ksqlDb.query.editor.readonly.background}; + background: ${readOnly && theme.ksqlDb.query.editor.readonly.background}; .ace-cursor { - ${theme.ksqlDb.query.editor.readonly.cursor} + ${readOnly && theme.ksqlDb.query.editor.readonly.cursor} + } + .ace_print-margin { + display: none; } ` );
null
test
train
2022-07-25T13:16:00
"2022-06-15T08:15:49Z"
armenuikafka
train
provectus/kafka-ui/1963_2357
provectus/kafka-ui
provectus/kafka-ui/1963
provectus/kafka-ui/2357
[ "connected", "timestamp(timedelta=2.0, similarity=1.0)" ]
b6e9e4386887035c0d0326cadbc03e3e48d935eb
27252393a290f32289aebd10c2d5b12e9d9d8500
[]
[]
"2022-07-28T17:37:25Z"
[ "scope/backend", "type/security", "status/accepted" ]
CVE fixes Q2.22
https://github.com/provectus/kafka-ui/runs/7354244079?check_suite_focus=true + upgrade alpine
[ "kafka-ui-api/Dockerfile", "pom.xml" ]
[ "kafka-ui-api/Dockerfile", "pom.xml" ]
[]
diff --git a/kafka-ui-api/Dockerfile b/kafka-ui-api/Dockerfile index 5488a771810..3990d488315 100644 --- a/kafka-ui-api/Dockerfile +++ b/kafka-ui-api/Dockerfile @@ -1,4 +1,4 @@ -FROM alpine:3.15.0 +FROM alpine:3.16.1 RUN apk add --no-cache openjdk13-jre libc6-compat gcompat \ && addgroup -S kafkaui && adduser -S kafkaui -G kafkaui diff --git a/pom.xml b/pom.xml index 2f307fdb5ad..c06d556132d 100644 --- a/pom.xml +++ b/pom.xml @@ -14,7 +14,7 @@ <maven.compiler.target>13</maven.compiler.target> <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> - <spring-boot.version>2.6.7</spring-boot.version> + <spring-boot.version>2.6.8</spring-boot.version> <jackson-databind-nullable.version>0.2.2</jackson-databind-nullable.version> <org.mapstruct.version>1.4.2.Final</org.mapstruct.version> <org.projectlombok.version>1.18.20</org.projectlombok.version>
null
val
train
2022-08-02T16:32:13
"2022-05-12T14:10:19Z"
Haarolean
train
provectus/kafka-ui/1756_2360
provectus/kafka-ui
provectus/kafka-ui/1756
provectus/kafka-ui/2360
[ "connected", "timestamp(timedelta=1.0, similarity=0.9385010753275334)" ]
70414d227943e6ca81fefb04a5848689e59eb473
ba983657e663da01c19fc037a865a6babe357e2c
[ "Hi, thanks for reaching out.\r\n\r\nWhat's the point? Have you encountered any problem?", "I'm thinking about multiple features\r\n\r\n- Apache Kafka supports Java 17\r\n- Ability to restart a connector's tasks on a single call in Kafka Connect\r\n- Extend OffsetFetch requests to accept multiple group ids.\r\n", "So, you're using the app as API?\r\n\r\n1. Good point, noted.\r\n2. I feel like we might've done it within #1444 already. Is that right?\r\n3. Okay.\r\n\r\n", "#1717 ", "the feature to restart all tasks is buggy in 3.1.0 , let's wait for 3.1.1\r\n\r\n-> https://issues.apache.org/jira/browse/KAFKA-13719?src=confmacro" ]
[]
"2022-07-29T20:01:17Z"
[ "good first issue", "scope/backend", "status/accepted", "type/dependencies" ]
Update kafka-clients dependency to 3.1.0
The current version of kafka-clients is 2.8.0; version 3.1.0 is now available.
[ "pom.xml" ]
[ "pom.xml" ]
[]
diff --git a/pom.xml b/pom.xml index e229c4959e2..2f307fdb5ad 100644 --- a/pom.xml +++ b/pom.xml @@ -20,7 +20,7 @@ <org.projectlombok.version>1.18.20</org.projectlombok.version> <org.projectlombok.e2e-checks.version>1.18.20</org.projectlombok.e2e-checks.version> <git.revision>latest</git.revision> - <kafka-clients.version>2.8.0</kafka-clients.version> + <kafka-clients.version>3.2.0</kafka-clients.version> <node.version>v16.15.0</node.version> <pnpm.version>v7.4.0</pnpm.version> <dockerfile-maven-plugin.version>1.4.10</dockerfile-maven-plugin.version>
null
train
train
2022-07-31T14:38:44
"2022-03-24T16:42:45Z"
raphaelauv
train
provectus/kafka-ui/2367_2368
provectus/kafka-ui
provectus/kafka-ui/2367
provectus/kafka-ui/2368
[ "timestamp(timedelta=1.0, similarity=0.9023984446348281)", "connected" ]
70414d227943e6ca81fefb04a5848689e59eb473
1c0b297a595a8f6e38a813cbdffd46b879220465
[ "@joseacl thank you. We'll take a look soon." ]
[]
"2022-08-01T10:34:32Z"
[ "type/enhancement", "status/accepted", "scope/k8s" ]
Add the possibility to not specify the image.registry value
**Describe the bug** It's not a bug but an improvement to how the docker image of the kafka-ui application is built, which will allow us to have fewer values in our deployment. <!--(A clear and concise description of what the bug is.)--> We're using this application in several environments (dev, qa, prod, etc.); for each environment we have configured docker/containerd to use a default mirror. The fact that the image treats `.Values.image.registry` as mandatory means we need to define this value per environment, overcomplicating the maintenance of these values. If we simply set `.Values.image.registry` to an empty string, the generated deployment does not work **Set up** <!-- How do you run the app? Please provide as much info as possible: 1. App version (docker image version or check commit hash in the top left corner in UI) 2. Helm chart version, if you use one 3. Any IAAC configs We might close the issue without further explanation if you don't provide such information. --> **Steps to Reproduce** Steps to reproduce the behavior: 1. Set this default value: `image: registry: ""` 2. Execute `helm template .` **Expected behavior** Generate a valid image to be pulled: ` provectuslabs/kafka-ui:latest ` Right now the value generated is `/provectuslabs/kafka-ui:latest`, which generates this error ` git pull /provectuslabs/kafka-ui:latest fatal: '/provectuslabs/kafka-ui:latest' does not appear to be a git repository fatal: Could not read from remote repository. `
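A quick TypeScript sketch of the behaviour the reporter asks for (hypothetical names; the actual change is the `kafka-ui.imageName` Helm helper in the patch below): only prefix the registry when one is set.

```typescript
// Sketch: an empty or missing registry must not produce a leading "/".
function imageName(repository: string, tag: string, registry?: string): string {
  return registry ? `${registry}/${repository}:${tag}` : `${repository}:${tag}`;
}

console.log(imageName('provectuslabs/kafka-ui', 'latest'));
// -> "provectuslabs/kafka-ui:latest" (the container runtime then applies its default mirror)
console.log(imageName('provectuslabs/kafka-ui', 'latest', 'mirror.example.com'));
// -> "mirror.example.com/provectuslabs/kafka-ui:latest"
```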
[ "charts/kafka-ui/templates/_helpers.tpl", "charts/kafka-ui/templates/deployment.yaml" ]
[ "charts/kafka-ui/templates/_helpers.tpl", "charts/kafka-ui/templates/deployment.yaml" ]
[]
diff --git a/charts/kafka-ui/templates/_helpers.tpl b/charts/kafka-ui/templates/_helpers.tpl index 076c4886f80..510452d4cf3 100644 --- a/charts/kafka-ui/templates/_helpers.tpl +++ b/charts/kafka-ui/templates/_helpers.tpl @@ -61,3 +61,19 @@ Create the name of the service account to use {{- default "default" .Values.serviceAccount.name }} {{- end }} {{- end }} + + +{{/* +This allows us to check if the registry of the image is specified or not. +*/}} +{{- define "kafka-ui.imageName" -}} +{{- $registryName := .Values.image.registry -}} +{{- $repository := .Values.image.repository -}} +{{- $tag := .Values.image.tag | default .Chart.AppVersion -}} +{{- if $registryName }} +{{- printf "%s/%s:%s" $registryName $repository $tag -}} +{{- else }} +{{- printf "%s:%s" $repository $tag -}} +{{- end }} +{{- end -}} + diff --git a/charts/kafka-ui/templates/deployment.yaml b/charts/kafka-ui/templates/deployment.yaml index 22149bb7513..1f7f6c92ad4 100644 --- a/charts/kafka-ui/templates/deployment.yaml +++ b/charts/kafka-ui/templates/deployment.yaml @@ -40,7 +40,7 @@ spec: - name: {{ .Chart.Name }} securityContext: {{- toYaml .Values.securityContext | nindent 12 }} - image: "{{ .Values.image.registry }}/{{ .Values.image.repository }}:{{ .Values.image.tag | default .Chart.AppVersion }}" + image: {{ include "kafka-ui.imageName" . }} imagePullPolicy: {{ .Values.image.pullPolicy }} {{- if or .Values.env .Values.yamlApplicationConfig .Values.yamlApplicationConfigConfigMap}} env:
null
test
train
2022-07-31T14:38:44
"2022-08-01T10:27:52Z"
joseacl
train
provectus/kafka-ui/2243_2370
provectus/kafka-ui
provectus/kafka-ui/2243
provectus/kafka-ui/2370
[ "timestamp(timedelta=0.0, similarity=0.8766084907221564)", "connected" ]
7f74bf312a0e51ed1c4ba530a84e07e6d4dcae72
757bf9526efa1d66308ca26bc5cae2ef2f3bf343
[]
[]
"2022-08-01T15:28:16Z"
[ "type/enhancement", "good first issue", "scope/backend", "status/accepted" ]
Edit Topic: Validation message about "Replication Factor *" acceptable data
**Describe the bug** It would be better to have a validation message about the accepted values for the "Replication Factor *" field **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Edit a Topic 3. Change the value of the "Replication Factor *" field to be <= 0 **Expected behavior** The response should contain the information "Value should be >= 1" instead of the generic "400 Bad Request Fields validation failure" it returns now **Screenshots** <img width="1718" alt="replication factor decrease" src="https://user-images.githubusercontent.com/104780608/177123772-0890654a-cc05-48b2-858e-2f555835a84f.png">
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java index 75c32984617..46ecedd0b3d 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java @@ -259,6 +259,11 @@ public Mono<ReplicationFactorChangeResponseDTO> changeReplicationFactor( new ValidationException( String.format("Topic already has replicationFactor %s.", actual))); } + if (requested <= 0) { + return Mono.error( + new ValidationException( + String.format("Requested replication factor (%s) should be greater or equal to 1.", requested))); + } if (requested > brokersCount) { return Mono.error( new ValidationException( diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index 63f313e2ac1..77817f0edb9 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -592,6 +592,8 @@ paths: $ref: '#/components/schemas/ReplicationFactorChangeResponse' 404: description: Not found + 400: + description: Bad Request /api/clusters/{clusterName}/topics/{topicName}/messages: get: @@ -3052,7 +3054,6 @@ components: properties: totalReplicationFactor: type: integer - minimum: 1 required: - totalReplicationFactor
null
val
train
2022-08-08T19:20:34
"2022-07-04T09:20:21Z"
armenuikafka
train
provectus/kafka-ui/2382_2387
provectus/kafka-ui
provectus/kafka-ui/2382
provectus/kafka-ui/2387
[ "connected" ]
38d795d5a0d57c3bb5de9ead6cd762671d06817b
9b3495a2e9e2c92b8a073bc4ed835a0c30b66b27
[ "Hello there sgolod! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hey, thanks for reporting. We'll take a look.", "Hello @sgolod , thx for discovering & reporting this\r\ncould you please try to use helm chart from this [branch](https://github.com/provectus/kafka-ui/tree/issues/2382) \r\nI don't have any GKE available to make final tests.", "> \r\n\r\nYou forgot to make same changes below (line 35 in this branch).\r\nIMHO, but this patch is overcomplication. This can be done more elegantly by comparing with \"1.19.0-0\", this solution suggested in same original Helm issue:\r\nhttps://github.com/helm/helm/issues/3810", "Hi @sgolod \r\nyou comment about line 35 correct, I also found and made correction in my last PR\r\nThe line `1.19.0` will be automatically treated as `1.19.0-0[]` when converting to the full semver format. I agree that your proposal will work as well. I Prefer to use my version, because it could be corner case when some K8s produce specific version value which could not be comparaible just with `1.19.0-0`. " ]
[ "Quite significant version increase, is it a typo?", "I can't follow why use this construct instead of simply checking the presence of the API version?\r\n```\r\n{{- if $.Capabilities.APIVersions.Has \"networking.k8s.io/v1\" -}}\r\n```\r\nAFAIK k8s version is irrelevant.\r\n\r\nConsider following simple tester template:\r\n```\r\napiVersion: v1\r\nkind: ConfigMap\r\nmetadata:\r\n name: tester\r\ndata:\r\n{{- if $.Capabilities.APIVersions.Has \"networking.k8s.io/v1\" }}\r\n check_v1: \"Has this API\"\r\n{{- else }}\r\n check_v1: \"Does not have this API\"\r\n{{- end }}\r\n{{- if $.Capabilities.APIVersions.Has \"networking.k8s.io/v2\" }}\r\n check_v2: \"Has this API\"\r\n{{- else }}\r\n check_v2: \"Does not have this API\"\r\n{{- end }}\r\n```\r\nIf you try installing it with --dry-run it will be rendered as:\r\n```\r\napiVersion: v1\r\nkind: ConfigMap\r\nmetadata:\r\n name: tester\r\ndata:\r\n check_v1: \"Has this API\"\r\n check_v2: \"Does not have this API\"\r\n```\r\nWhich pretty much proves the point.", "No, currently the version of helm chart is not track at all. The release job replace chart version during build and push right version of Helm chart in Helm repo, but don't change anything in this repo.\r\n@Haarolean we need to decide, how we can solve this in future.", "Long story :)\r\nPlease review \r\nhttps://github.com/provectus/kafka-ui/issues/2317\r\nhttps://github.com/provectus/kafka-ui/pull/2318#issuecomment-1203893817\r\nYou could also check Kuberntes API reference\r\nhttps://github.com/instrumenta/kubernetes-json-schema/blob/master/v1.18.1/networkpolicy.json - networking.k8s.io/v1\r\nhttps://github.com/instrumenta/kubernetes-json-schema/blob/master/v1.18.1/ingress.json - networking.k8s.io/v1beta1\r\n\r\n\r\n\r\n\r\n", "@azatsafin \r\nGot it. If you will look closely at the output of `.Capabilities.APIVersions` internal, you'll notice that it returns a list of GVKs.\r\n\r\nTest stand:\r\n```YAML\r\napiVersion: v1\r\nkind: ConfigMap\r\nmetadata:\r\n name: tester\r\ndata:\r\n apiVersions: |\r\n {{ $.Capabilities.APIVersions }}\r\n```\r\n\r\nResult(cut to networking.k8s.io resources only):\r\n\r\n```\r\n[... networking.k8s.io/v1/NetworkPolicy networking.k8s.io/v1/Ingress networking.k8s.io/v1 networking.k8s.io/v1/IngressClass ...]\r\n```\r\nThis means you can be more specific in your checks, eg:\r\n\r\n```\r\n{{- if $.Capabilities.APIVersions.Has \"networking.k8s.io/v1/Ingress\" }}\r\n```\r\n\r\nI believe this will fix the issue in a more straightforward manner.", "This is a good idea, but please take into account that `helm template` command do not provide resource names for apiVersions, only 'helm install --dry-run' will work for it. \r\nAnd please have a look [here](https://github.com/provectus/kafka-ui/blob/98257b2b5fd77a0141f23bebb2b903d90d0a3113/.github/workflows/helm.yaml#L31) \r\nThere are no option to write any logic which will not look to the `apiVersion/resourceName` values and produce right template for ingress/networkPolicy. 
\r\n\r\nExample:\r\n`helm template abc abc`\r\n``` \r\napiVersion: v1\r\nkind: ConfigMap\r\nmetadata:\r\n name: tester\r\ndata:\r\n apiVersions: |\r\n [v1 admissionregistration.k8s.io/v1 admissionregistration.k8s.io/v1beta1 internal.apiserver.k8s.io/v1alpha1 apps/v1 apps/v1beta1 apps/v1beta2 authentication.k8s.io/v1 authentication.k8s.io/v1beta1 authorization.k8s.io/v1 authorization.k8s.io/v1beta1 autoscaling/v1 autoscaling/v2 autoscaling/v2beta1 autoscaling/v2beta2 batch/v1 batch/v1beta1 certificates.k8s.io/v1 certificates.k8s.io/v1beta1 coordination.k8s.io/v1beta1 coordination.k8s.io/v1 discovery.k8s.io/v1 discovery.k8s.io/v1beta1 events.k8s.io/v1 events.k8s.io/v1beta1 extensions/v1beta1 flowcontrol.apiserver.k8s.io/v1alpha1 flowcontrol.apiserver.k8s.io/v1beta1 flowcontrol.apiserver.k8s.io/v1beta2 networking.k8s.io/v1 networking.k8s.io/v1beta1 node.k8s.io/v1 node.k8s.io/v1alpha1 node.k8s.io/v1beta1 policy/v1 policy/v1beta1 rbac.authorization.k8s.io/v1 rbac.authorization.k8s.io/v1beta1 rbac.authorization.k8s.io/v1alpha1 scheduling.k8s.io/v1alpha1 scheduling.k8s.io/v1beta1 scheduling.k8s.io/v1 storage.k8s.io/v1beta1 storage.k8s.io/v1 storage.k8s.io/v1alpha1 apiextensions.k8s.io/v1beta1 apiextensions.k8s.io/v1]\r\n```\r\n\r\n`helm install --dry-run abc abc`\r\n```\r\napiVersion: v1\r\nkind: ConfigMap\r\nmetadata:\r\n name: tester\r\ndata:\r\n apiVersions: |\r\n [apps/v1 storage.k8s.io/v1beta1 v1/Secret apps/v1/ControllerRevision apps/v1/Scale events.k8s.io/v1beta1/Event policy/v1beta1/PodSecurityPolicy batch/v1 v1/Node batch/v1beta1/CronJob argoproj.io/v1alpha1/ArgoCDExtension v1/Eviction networking.k8s.io/v1/IngressClass scheduling.k8s.io/v1/PriorityClass node.k8s.io/v1/RuntimeClass apiextensions.k8s.io/v1 node.k8s.io/v1beta1 apps/v1/StatefulSet rbac.authorization.k8s.io/v1/ClusterRole metrics.k8s.io/v1beta1/NodeMetrics admissionregistration.k8s.io/v1 authorization.k8s.io/v1beta1/SubjectAccessReview authorization.k8s.io/v1beta1 rbac.authorization.k8s.io/v1beta1 autoscaling/v2beta2/HorizontalPodAutoscaler apiregistration.k8s.io/v1 scheduling.k8s.io/v1beta1 crd.k8s.amazonaws.com/v1alpha1 v1/ResourceQuota admissionregistration.k8s.io/v1/ValidatingWebhookConfiguration coordination.k8s.io/v1 authorization.k8s.io/v1beta1/SelfSubjectRulesReview networking.k8s.io/v1beta1/Ingress policy/v1/PodDisruptionBudget certificates.k8s.io/v1/CertificateSigningRequest networking.k8s.io/v1/Ingress rbac.authorization.k8s.io/v1beta1/ClusterRole storage.k8s.io/v1/CSIDriver flowcontrol.apiserver.k8s.io/v1beta1/FlowSchema crd.k8s.amazonaws.com/v1alpha1/ENIConfig events.k8s.io/v1 certificates.k8s.io/v1beta1 apiregistration.k8s.io/v1beta1/APIService authorization.k8s.io/v1beta1/LocalSubjectAccessReview autoscaling/v2beta1/HorizontalPodAutoscaler scheduling.k8s.io/v1beta1/PriorityClass admissionregistration.k8s.io/v1beta1 discovery.k8s.io/v1beta1 v1/PodAttachOptions authorization.k8s.io/v1/SubjectAccessReview admissionregistration.k8s.io/v1beta1/MutatingWebhookConfiguration v1/Scale apiextensions.k8s.io/v1beta1/CustomResourceDefinition v1/Binding v1/NodeProxyOptions v1/PodProxyOptions apps/v1/Deployment networking.k8s.io/v1/NetworkPolicy storage.k8s.io/v1/VolumeAttachment storage.k8s.io/v1beta1/CSIDriver metrics.k8s.io/v1beta1/PodMetrics events.k8s.io/v1/Event apps/v1/DaemonSet apps/v1/ReplicaSet authorization.k8s.io/v1beta1/SelfSubjectAccessReview rbac.authorization.k8s.io/v1beta1/ClusterRoleBinding storage.k8s.io/v1beta1/VolumeAttachment coordination.k8s.io/v1beta1/Lease events.k8s.io/v1beta1 autoscaling/v2beta1 
autoscaling/v2beta2 coordination.k8s.io/v1beta1 authorization.k8s.io/v1/SelfSubjectAccessReview rbac.authorization.k8s.io/v1/ClusterRoleBinding vpcresources.k8s.aws/v1beta1/SecurityGroupPolicy batch/v1beta1 v1/ComponentStatus v1/PodExecOptions discovery.k8s.io/v1/EndpointSlice authorization.k8s.io/v1 apiextensions.k8s.io/v1beta1 discovery.k8s.io/v1 policy/v1beta1/PodDisruptionBudget v1 certificates.k8s.io/v1 policy/v1 metrics.k8s.io/v1beta1 apiextensions.k8s.io/v1/CustomResourceDefinition apiregistration.k8s.io/v1beta1 v1/LimitRange v1/PodPortForwardOptions authorization.k8s.io/v1/SelfSubjectRulesReview batch/v1/Job autoscaling/v1 v1/ServiceAccount rbac.authorization.k8s.io/v1beta1/Role admissionregistration.k8s.io/v1/MutatingWebhookConfiguration admissionregistration.k8s.io/v1beta1/ValidatingWebhookConfiguration argoproj.io/v1alpha1/AppProject authorization.k8s.io/v1/LocalSubjectAccessReview autoscaling/v1/HorizontalPodAutoscaler authentication.k8s.io/v1 v1/ConfigMap networking.k8s.io/v1beta1/IngressClass storage.k8s.io/v1beta1/StorageClass coordination.k8s.io/v1/Lease node.k8s.io/v1beta1/RuntimeClass networking.k8s.io/v1 node.k8s.io/v1 flowcontrol.apiserver.k8s.io/v1beta1 v1/Service flowcontrol.apiserver.k8s.io/v1beta1/PriorityLevelConfiguration authentication.k8s.io/v1beta1 storage.k8s.io/v1 scheduling.k8s.io/v1 vpcresources.k8s.aws/v1beta1 v1/Endpoints v1/Pod certificates.k8s.io/v1beta1/CertificateSigningRequest rbac.authorization.k8s.io/v1/RoleBinding storage.k8s.io/v1/CSINode argoproj.io/v1alpha1/Application extensions/v1beta1 v1/Namespace extensions/v1beta1/Ingress networking.k8s.io/v1beta1 rbac.authorization.k8s.io/v1 v1/TokenRequest v1/ServiceProxyOptions storage.k8s.io/v1beta1/CSINode v1/PersistentVolume v1/ReplicationController rbac.authorization.k8s.io/v1/Role rbac.authorization.k8s.io/v1beta1/RoleBinding storage.k8s.io/v1/StorageClass argoproj.io/v1alpha1 batch/v1/CronJob discovery.k8s.io/v1beta1/EndpointSlice policy/v1beta1 v1/Event v1/PersistentVolumeClaim v1/PodTemplate apiregistration.k8s.io/v1/APIService authentication.k8s.io/v1/TokenReview authentication.k8s.io/v1beta1/TokenReview storage.k8s.io/v1beta1/CSIStorageCapacity]\r\n```\r\nLast command could be executed only for specific kubernetes context, which are not available for testing. \r\n\r\n\r\n", "You are right, `template` will not provide the correct apiversions but it won't provide .Capabilities.KubeVersion.Version as well for the same reason. In fact detected version will be determined with your helm version. Just did 2 checks:\r\n- with `helm v3.3.0` .Capabilities.KubeVersion.Version == 1.18.0\r\n- with `helm v3.9.2` .Capabilities.KubeVersion.Version == 1.24.0\r\n\r\nEither way it's fair because this is a documented behaviour of `helm template`.\r\n\r\nIf you you are not convinced - let me know and I'll resolve this conversation.", "I agree that we should track version of used HELM, and supported kubernetes versions. @Haarolean \r\n\r\nI don't want to link resource definitions to use HELM versions, because one HELM version support at least 3 Kubernetes version, we can find it [here](https://helm.sh/docs/topics/version_skew/) .\r\n\r\nCurrently, in our test [pipeline](https://github.com/provectus/kafka-ui/blob/757bf9526efa1d66308ca26bc5cae2ef2f3bf343/.github/workflows/helm.yaml#L31), we are using `--kube-version` argument to provide Kubernetes vesions to Charts, I don't want to change that logic now. " ]
"2022-08-03T15:44:34Z"
[ "type/bug", "status/accepted", "scope/k8s" ]
Latest Ingress changes break compatibility with GKE cluster versions (GKE uses -gke.XXXX as a version suffix)
This change breaks compatibility with GKE cluster versions (GKE uses -gke.XXXX as a version suffix). It can be solved by using a semver comparison against the full version ("1.19" -> "1.19.0-0") ``` (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19" ) => (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19.0-0" ) ``` _Originally posted by @sgolod in https://github.com/provectus/kafka-ui/issues/2318#issuecomment-1203893817_
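A small TypeScript sketch of why the comparison must tolerate vendor suffixes (hand-rolled on purpose, so no particular semver library's prerelease-matching rules are implied; the chart's actual fix uses Helm's `semver` and `Compare` functions, shown in the patch below):

```typescript
// Compare only the numeric core of a version, ignoring suffixes like "-gke.100".
// Constraint-style matchers often refuse to match prerelease-tagged versions,
// which is what broke `semverCompare ">= 1.19"` on GKE.
function atLeast(version: string, major: number, minor: number): boolean {
  const core = version.replace(/^v/, '').split('-')[0]; // "1.19.3-gke.100" -> "1.19.3"
  const [maj, min] = core.split('.').map(Number);
  return maj > major || (maj === major && min >= minor);
}

console.log(atLeast('v1.19.3-gke.100', 1, 19)); // true: should get networking.k8s.io/v1
console.log(atLeast('v1.18.20-gke.501', 1, 19)); // false: must stay on v1beta1
```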
[ "charts/kafka-ui/Chart.yaml", "charts/kafka-ui/templates/ingress.yaml" ]
[ "charts/kafka-ui/Chart.yaml", "charts/kafka-ui/templates/ingress.yaml" ]
[]
diff --git a/charts/kafka-ui/Chart.yaml b/charts/kafka-ui/Chart.yaml index 1177e74887c..7eaff3c59aa 100644 --- a/charts/kafka-ui/Chart.yaml +++ b/charts/kafka-ui/Chart.yaml @@ -2,6 +2,6 @@ apiVersion: v2 name: kafka-ui description: A Helm chart for kafka-UI type: application -version: 0.0.4 +version: 0.4.2 appVersion: latest icon: https://github.com/provectus/kafka-ui/raw/master/documentation/images/kafka-ui-logo.png diff --git a/charts/kafka-ui/templates/ingress.yaml b/charts/kafka-ui/templates/ingress.yaml index 7659867b31a..e4b33439c42 100644 --- a/charts/kafka-ui/templates/ingress.yaml +++ b/charts/kafka-ui/templates/ingress.yaml @@ -1,7 +1,9 @@ {{- if .Values.ingress.enabled -}} {{- $fullName := include "kafka-ui.fullname" . -}} {{- $svcPort := .Values.service.port -}} -{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19" ) -}} +{{- $kubeCapabilityVersion := semver .Capabilities.KubeVersion.Version -}} +{{- $isHigher1p19 := ge (semver "1.19" | $kubeCapabilityVersion.Compare) 0 -}} +{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") $isHigher1p19 -}} apiVersion: networking.k8s.io/v1 {{- else if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 @@ -30,7 +32,7 @@ spec: rules: - http: paths: -{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") (trimPrefix "v" .Capabilities.KubeVersion.Version | semverCompare ">= 1.19" ) -}} +{{- if and ($.Capabilities.APIVersions.Has "networking.k8s.io/v1") $isHigher1p19 -}} {{- range .Values.ingress.precedingPaths }} - path: {{ .path }} pathType: Prefix
null
train
train
2022-08-03T14:54:20
"2022-08-03T12:37:47Z"
sgolod
train
provectus/kafka-ui/2262_2390
provectus/kafka-ui
provectus/kafka-ui/2262
provectus/kafka-ui/2390
[ "keyword_pr_to_issue", "connected" ]
8f0ffe665c53429f4d9618adafd31663b50ddd3f
98257b2b5fd77a0141f23bebb2b903d90d0a3113
[]
[]
"2022-08-03T21:27:55Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Keep the limitation error messages the same for all numerical fields in a Topic profile
**Describe the bug** It would be better to have the same limitation error message for all number fields within the Topic profile **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to Topics 2. Add a topic 3. Enter an unaccepted value for the "Number of partitions *", "Replication Factor *", "Min In Sync Replicas *" fields (a value not in -2147483648 to 2147483647) **Expected behavior** The error messages should be the same for all the fields and contain information about the acceptable values **Screenshots** <img width="1680" alt="error message limitation 2" src="https://user-images.githubusercontent.com/104780608/178687024-f7f1bcd7-135b-4bfd-9b4f-4c45ce714b28.png"> <img width="1680" alt="error messagıon 1" src="https://user-images.githubusercontent.com/104780608/178687040-961ed960-cef5-4254-8c34-5a9369a697e7.png"> **Additional context** <!-- (Add any other context about the problem here) -->
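One way to keep the three messages identical is a shared schema factory. A hedged sketch, assuming the project's existing `yup` dependency (the `int32Field` helper is invented here for illustration; the actual patch below only adds the missing `.max()` bounds):

```typescript
import * as yup from 'yup';

// Hypothetical helper: one message template for every bounded integer field.
const int32Field = (label: string) =>
  yup
    .number()
    .min(1, `${label} must be between 1 and 2147483647`)
    .max(2147483647, `${label} must be between 1 and 2147483647`)
    .required()
    .typeError(`${label} is required and must be a number`);

const schema = yup.object().shape({
  partitions: int32Field('Number of partitions'),
  replicationFactor: int32Field('Replication factor'),
  minInSyncReplicas: int32Field('Min in sync replicas'),
});

schema
  .validate({ partitions: 9999999999, replicationFactor: 1, minInSyncReplicas: 1 })
  .catch((e) => console.log(e.message));
// -> "Number of partitions must be between 1 and 2147483647"
```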
[ "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[ "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[]
diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index 967bc0bbafe..ca51e662f15 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -55,16 +55,19 @@ export const topicFormValidationSchema = yup.object().shape({ partitions: yup .number() .min(1) + .max(2147483647) .required() .typeError('Number of partitions is required and must be a number'), replicationFactor: yup .number() .min(1) + .max(2147483647) .required() .typeError('Replication factor is required and must be a number'), minInSyncReplicas: yup .number() .min(1) + .max(2147483647) .required() .typeError('Min in sync replicas is required and must be a number'), cleanupPolicy: yup.string().required(),
null
train
train
2022-08-04T23:25:50
"2022-07-13T08:25:56Z"
armenuikafka
train
provectus/kafka-ui/1622_2398
provectus/kafka-ui
provectus/kafka-ui/1622
provectus/kafka-ui/2398
[ "keyword_pr_to_issue" ]
98257b2b5fd77a0141f23bebb2b903d90d0a3113
122f90fbb25c3b20e461246c0a50aeb08ef51459
[]
[]
"2022-08-04T19:23:19Z"
[ "scope/backend", "type/chore" ]
Support JVM hot swap for `spring boot:run`
Probably missing spring-devtools at least
[ "kafka-ui-api/pom.xml" ]
[ "kafka-ui-api/pom.xml" ]
[]
diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index a6ff458c6c5..53eb24731d7 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -217,6 +217,11 @@ <artifactId>datasketches-java</artifactId> <version>${datasketches-java.version}</version> </dependency> + <dependency> + <groupId>org.springframework.boot</groupId> + <artifactId>spring-boot-devtools</artifactId> + <optional>true</optional> + </dependency> </dependencies>
null
train
train
2022-08-05T17:13:28
"2022-02-17T16:19:10Z"
Haarolean
train
provectus/kafka-ui/2375_2400
provectus/kafka-ui
provectus/kafka-ui/2375
provectus/kafka-ui/2400
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.9008447037262984)", "connected" ]
95a030614337529ff3dd578bb03fe1de18e5da7e
26d800f997ab32b408d0e307370985ebe2b9cf5c
[ "Hello there as93717913! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "1. Fix character escaping for the filter query\r\n2. New line symbols should be present as well", "Is there a work around for this at this moment?", "> Is there a work around for this at this moment?\r\n\r\nunfortunately, no. But we're planning to fix this one soon." ]
[]
"2022-08-05T02:46:03Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Smart filters multiple conditions not working
<!-- Don't forget to check for existing issues/discussions regarding your proposal. We might already have it. https://github.com/provectus/kafka-ui/issues https://github.com/provectus/kafka-ui/discussions --> **Describe the bug** <!--(A clear and concise description of what the bug is.)--> The Smart Filter works for only one condition, like `value.eventData.dutyDepartment == "VQA"`, but when there are multiple conditions, the second, third, and further conditions don't work properly, e.g. `value.eventData.dutyDepartment == "VQA" && value.eventData.moSapPlant =="123456"` The second part `value.eventData.moSapPlant` will be ignored; the result will only match the dutyDepartment condition **Set up** <!-- How do you run the app? Please provide as much info as possible: 1. App version (docker image version or check commit hash in the top left corner in UI) 2. Helm chart version, if you use one 3. Any IAAC configs We might close the issue without further explanation if you don't provide such information. --> Docker **Steps to Reproduce** <!-- We'd like you to provide an example setup (via docker-compose, helm, etc.) to reproduce the problem, especially with complex setups. --> Steps to reproduce the behavior: 1. **Expected behavior** <!-- (A clear and concise description of what you expected to happen) --> **Screenshots** <!-- (If applicable, add screenshots to help explain your problem) --> one condition ![image](https://user-images.githubusercontent.com/16937091/182339375-b438bb9f-7154-40e1-a230-e4c4e33868ea.png) result ![image](https://user-images.githubusercontent.com/16937091/182339521-98939bba-b4a1-4109-86a1-df82b43c3b0f.png) two conditions ![image](https://user-images.githubusercontent.com/16937091/182339714-f72cd911-a3d1-4cc4-8ea5-d27840701db9.png) result ![image](https://user-images.githubusercontent.com/16937091/182339788-72367bc2-f727-4924-b680-2353e7489dff.png) change condition step ![image](https://user-images.githubusercontent.com/16937091/182340026-3e1685aa-6f34-476e-91a7-741710eaa46a.png) result ![image](https://user-images.githubusercontent.com/16937091/182340139-e2884c31-f661-49ca-b1a3-7a9e99092d16.png) The result will only match the first condition
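The root cause, per the patch below, is query-string assembly: the filter code was embedded without URL encoding, so the first unencoded `&` (from `&&`) terminates the parameter. A minimal TypeScript illustration:

```typescript
const filterCode = 'value.a == "VQA" && value.b == "123456"';

// Broken: "&" splits the query string, so everything after the first "&&" is lost.
const broken = `q=${filterCode}`;
console.log(new URLSearchParams(broken).get('q'));
// -> 'value.a == "VQA" ' (the second condition never reaches the backend)

// Fixed: encode the value before embedding it, as the patch does.
const fixed = `q=${encodeURIComponent(filterCode)}`;
console.log(new URLSearchParams(fixed).get('q'));
// -> 'value.a == "VQA" && value.b == "123456"'
```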
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx index 7acf873d6d9..018a8ef1fff 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/Filters.tsx @@ -223,7 +223,7 @@ const Filters: React.FC<FiltersProps> = ({ const newProps = omitBy(props, (v) => v === undefined || v === ''); const qs = Object.keys(newProps) - .map((key) => `${key}=${newProps[key]}`) + .map((key) => `${key}=${encodeURIComponent(newProps[key] as string)}`) .join('&'); navigate({
null
val
train
2022-08-18T12:28:51
"2022-08-02T09:20:59Z"
as93717913
train
provectus/kafka-ui/2322_2410
provectus/kafka-ui
provectus/kafka-ui/2322
provectus/kafka-ui/2410
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=1.0000000000000002)" ]
b1a13e442b1a729aa3ba157c681deda59cd335dd
7f74bf312a0e51ed1c4ba530a84e07e6d4dcae72
[ "Hey @Haarolean , I'm interested in contributing to this issue, so before I start working on it, would you mind sparing your time explaining what the issue is about and pointing me to some resources to get started.", "@9gl hey, the idea is to not display \"none\" if there's no error :) But it turns out it's a backend issue. If you wanna tackle any other frontend-related, you can check out [up for grabs](https://github.com/provectus/kafka-ui/projects/11#card-84437819) board." ]
[]
"2022-08-05T20:16:53Z"
[ "type/enhancement", "good first issue", "scope/backend", "status/accepted" ]
Do not display broker error as "none"
<img width="805" alt="image" src="https://user-images.githubusercontent.com/1494347/180452842-d2aa5e83-c2d4-4cad-a735-f87a2647fb46.png">
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java index c7e66d0f455..3d84aa3ad90 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/DescribeLogDirsMapper.java @@ -8,6 +8,7 @@ import java.util.Map; import java.util.stream.Collectors; import org.apache.kafka.common.TopicPartition; +import org.apache.kafka.common.protocol.Errors; import org.apache.kafka.common.requests.DescribeLogDirsResponse; import org.springframework.stereotype.Component; @@ -28,7 +29,7 @@ private BrokersLogdirsDTO toBrokerLogDirs(Integer broker, String dirName, DescribeLogDirsResponse.LogDirInfo logDirInfo) { BrokersLogdirsDTO result = new BrokersLogdirsDTO(); result.setName(dirName); - if (logDirInfo.error != null) { + if (logDirInfo.error != null && logDirInfo.error != Errors.NONE) { result.setError(logDirInfo.error.message()); } var topics = logDirInfo.replicaInfos.entrySet().stream()
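The same pattern, sketched in TypeScript for illustration (the real fix is the Java `Errors.NONE` check in the patch above): a protocol-level `NONE` sentinel means "no error" and should never be rendered as an error message.

```typescript
// Hypothetical guard: map the sentinel to "no error" before display.
function displayableError(error?: string): string | undefined {
  return error && error !== 'NONE' ? error : undefined;
}

console.log(displayableError('NONE')); // undefined, nothing to show
console.log(displayableError('KAFKA_STORAGE_ERROR')); // "KAFKA_STORAGE_ERROR"
```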
null
train
train
2022-08-08T15:04:29
"2022-07-22T13:48:08Z"
Haarolean
train
provectus/kafka-ui/2411_2412
provectus/kafka-ui
provectus/kafka-ui/2411
provectus/kafka-ui/2412
[ "connected", "timestamp(timedelta=0.0, similarity=0.8483214303572048)" ]
32b550f6710f109225d27c1d68ed6d65ce191eae
5beb6168b9d009e268374907c1e9bf0818c6ba1d
[ "Hello there occidere! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€" ]
[]
"2022-08-06T07:44:35Z"
[ "type/documentation" ]
SECURE_BROKER guide link in README.md is not working
<!-- Don't forget to check for existing issues/discussions regarding your proposal. We might already have it. https://github.com/provectus/kafka-ui/issues https://github.com/provectus/kafka-ui/discussions --> **Describe the bug** <!--(A clear and concise description of what the bug is.)--> - The SECURE_BROKER guide link in README.md is not working - https://github.com/provectus/kafka-ui/blob/master/README.md?plain=1#L77 - The SECURE_BROKER file looks like it was moved to [this](https://github.com/provectus/kafka-ui/blob/master/documentation/guides/SECURE_BROKER.md) directory.
[ "README.md" ]
[ "README.md" ]
[]
diff --git a/README.md b/README.md index d3efd90c543..38e885c1640 100644 --- a/README.md +++ b/README.md @@ -74,7 +74,7 @@ We have plenty of [docker-compose files](documentation/compose/DOCKER_COMPOSE.md - [SSO configuration](documentation/guides/SSO.md) - [AWS IAM configuration](documentation/guides/AWS_IAM.md) - [Docker-compose files](documentation/compose/DOCKER_COMPOSE.md) -- [Connection to a secure broker](documentation/compose/SECURE_BROKER.md) +- [Connection to a secure broker](documentation/guides/SECURE_BROKER.md) ### Configuration File Example of how to configure clusters in the [application-local.yml](https://github.com/provectus/kafka-ui/blob/master/kafka-ui-api/src/main/resources/application-local.yml) configuration file:
null
test
train
2022-08-05T21:15:48
"2022-08-06T07:30:08Z"
occidere
train
provectus/kafka-ui/2373_2416
provectus/kafka-ui
provectus/kafka-ui/2373
provectus/kafka-ui/2416
[ "timestamp(timedelta=1.0, similarity=0.9689114746312342)", "connected" ]
7765a268af412e0045f6815cc024aaa7567519a8
ffb62d3eabf2484cc90a036f3a1782bfcc1cddf1
[]
[ "and again, that's a wrong test" ]
"2022-08-08T11:53:18Z"
[ "type/bug", "scope/QA", "status/accepted", "status/confirmed" ]
Fix unstable e2e test
it shouldn't depend on the number of topics in any case ![image](https://user-images.githubusercontent.com/1494347/182204890-692395bf-0101-4ff8-8cc9-622179f470e8.png)
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java" ]
[ "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java" ]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java index cfd120f3a01..5b816d05f5c 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java @@ -55,7 +55,7 @@ public TopicView openTopic(String topicName) { @SneakyThrows public TopicsList isTopicNotVisible(String topicName) { $$x("//table/tbody/tr/td[2]") - .shouldBe(CollectionCondition.sizeGreaterThanOrEqual(4)) + .shouldBe(CollectionCondition.sizeGreaterThan(0)) .find(Condition.exactText(topicName)) .shouldBe(Condition.not(Condition.visible)); return this;
diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java index a962b8ceabb..75e5cfa7224 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java @@ -64,7 +64,6 @@ public void createTopic() { .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.TOPICS) .topicIsNotVisible(NEW_TOPIC); } - @Disabled("Due to issue https://github.com/provectus/kafka-ui/issues/1500 ignore this test") @SneakyThrows @DisplayName("should update a topic") @@ -104,7 +103,6 @@ public void updateTopic() { @AutomationStatus(status = Status.AUTOMATED) @CaseId(207) @Test - @Disabled // TODO: https://github.com/provectus/kafka-ui/issues/2373 public void deleteTopic() { pages.openTopicsList(SECOND_LOCAL) .isOnPage()
train
train
2022-08-09T15:13:03
"2022-08-01T17:10:56Z"
Haarolean
train
provectus/kafka-ui/2295_2418
provectus/kafka-ui
provectus/kafka-ui/2295
provectus/kafka-ui/2418
[ "timestamp(timedelta=1.0, similarity=0.9155516629684449)", "connected" ]
ffb62d3eabf2484cc90a036f3a1782bfcc1cddf1
d781ac45da47afcd2c2d782f8d1ca06a696468c7
[ "@Haarolean as we agreed, all those methods containing just 'SelenideElement.shouldBe(Condition)' are needed\r\nbut to avoid such confuses methods should be renamed to 'waitUntilScreenRedy()' because the main goal is waiting for element's state and not assertion\r\n@anezboretskiy please find all those methods and fix naming" ]
[ "need to fix step description: 'Wait until page opened'", "@anezboretskiy better just delete description from annotation", "@VladSenyuta description deleted " ]
"2022-08-09T10:51:25Z"
[ "type/enhancement", "scope/QA", "status/accepted" ]
[e2e] Get rid of asserts in helper classes
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java" ]
[ "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java" ]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java index 7e2a6640a82..797d0277b9d 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java @@ -26,7 +26,7 @@ public MainPage goTo() { } @Step - public MainPage isOnPage() { + public MainPage waitUntilScreenReady() { $(By.xpath("//*[contains(text(),'Loading')]")).shouldBe(Condition.disappear); $("input[name=switchRoundedDefault]").parent().$("span").shouldBe(Condition.visible); return this; diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java index e87dcfe342f..69f67a9a4af 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java @@ -34,8 +34,8 @@ public ConnectorsView setConnectorConfig(String connectName, String configJson) return new ConnectorsView(); } - @Step("Verify that page 'Create Connector' opened") - public ConnectorCreateView isOnConnectorCreatePage() { + @Step + public ConnectorCreateView waitUntilScreenReady() { nameField.shouldBe(Condition.visible); return this; } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java index 13c2ce2b70f..ed4b045caef 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java @@ -24,7 +24,7 @@ public ConnectorsList goTo(String cluster) { } @Step - public ConnectorsList isOnPage() { + public ConnectorsList waitUntilScreenReady() { $(By.xpath("//h1[text()='Connectors']")).shouldBe(Condition.visible); return this; } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java index cce7ecefc4c..a2b715f4f18 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaView.java @@ -12,7 +12,7 @@ public class SchemaView { @Step - public SchemaView isOnSchemaViewPage() { + public SchemaView waitUntilScreenReady() { $("div#schema").shouldBe(Condition.visible); return this; } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java index a71d49e0f51..9e667df7197 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java @@ -29,7 +29,7 @@ public TopicView goTo(String cluster, String topic) { } @Step - public TopicView isOnTopicViewPage() { + public TopicView waitUntilScreenReady() { $(By.linkText("Overview")).shouldBe(Condition.visible); return this; } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java 
b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java index 5b816d05f5c..ccdf4b7a110 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java @@ -25,7 +25,7 @@ public TopicsList goTo(String cluster) { } @Step - public TopicsList isOnPage() { + public TopicsList waitUntilScreenReady() { $(By.xpath("//*[contains(text(),'Loading')]")).shouldBe(Condition.disappear); $(By.xpath("//h1[text()='All Topics']")).shouldBe(Condition.visible); return this;
diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java index e52d9dc1d87..149acf4aa56 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java @@ -16,7 +16,7 @@ public class SmokeTests extends BaseTest { @DisplayName("main page should load") void mainPageLoads() { pages.open() - .isOnPage(); + .waitUntilScreenReady(); compareScreenshots("main"); } diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java index 57002ee5b6b..fe0e468e45f 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java @@ -70,14 +70,14 @@ public static void afterAll() { @Test public void createConnector() { pages.openConnectorsList(LOCAL_CLUSTER) - .isOnPage() + .waitUntilScreenReady() .clickCreateConnectorButton() - .isOnConnectorCreatePage() + .waitUntilScreenReady() .setConnectorConfig( SINK_CONNECTOR, FileUtils.getResourceAsString("config_for_create_connector.json")); pages.openConnectorsList(LOCAL_CLUSTER) - .isOnPage() + .waitUntilScreenReady() .connectorIsVisibleInList(SINK_CONNECTOR, TOPIC_FOR_CONNECTOR); } @@ -89,7 +89,7 @@ public void createConnector() { @Test public void updateConnector() { pages.openConnectorsList(LOCAL_CLUSTER) - .isOnPage() + .waitUntilScreenReady() .openConnector(CONNECTOR_FOR_UPDATE); pages.connectorsView.connectorIsVisibleOnOverview(); pages.connectorsView.openEditConfig() @@ -106,7 +106,7 @@ public void updateConnector() { @Test public void deleteConnector() { pages.openConnectorsList(LOCAL_CLUSTER) - .isOnPage() + .waitUntilScreenReady() .openConnector(CONNECTOR_FOR_DELETE); pages.connectorsView.clickDeleteButton(); pages.openConnectorsList(LOCAL_CLUSTER) diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java index d7f1bf9c0c6..af14f108198 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java @@ -71,7 +71,7 @@ void createSchemaAvro() throws IOException { .setSchemaField(readFileAsString(PATH_AVRO_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.AVRO) .clickSubmit() - .isOnSchemaViewPage(); + .waitUntilScreenReady(); pages.mainPage .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.isSchemaVisible(SCHEMA_AVRO_CREATE); @@ -88,12 +88,12 @@ void updateSchemaAvro() { pages.openMainPage() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.openSchema(SCHEMA_AVRO_API_UPDATE) - .isOnSchemaViewPage() + .waitUntilScreenReady() .openEditSchema() .selectCompatibilityLevelFromDropdown(CompatibilityLevel.CompatibilityEnum.NONE) .setNewSchemaValue(readFileAsString(PATH_AVRO_FOR_UPDATE)) .clickSubmit() - .isOnSchemaViewPage() + .waitUntilScreenReady() .isCompatibility(CompatibilityLevel.CompatibilityEnum.NONE); } @@ -108,7 +108,7 @@ void deleteSchemaAvro() { pages.openMainPage() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.openSchema(SCHEMA_AVRO_API) - 
.isOnSchemaViewPage() + .waitUntilScreenReady() .removeSchema() .isNotVisible(SCHEMA_AVRO_API); } @@ -128,7 +128,7 @@ void createSchemaJson() { .setSchemaField(readFileAsString(PATH_JSON_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.JSON) .clickSubmit() - .isOnSchemaViewPage(); + .waitUntilScreenReady(); pages.mainPage .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.isSchemaVisible(SCHEMA_JSON_CREATE); @@ -145,7 +145,7 @@ void deleteSchemaJson() { pages.openMainPage() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.openSchema(SCHEMA_JSON_API) - .isOnSchemaViewPage() + .waitUntilScreenReady() .removeSchema() .isNotVisible(SCHEMA_JSON_API); } @@ -165,7 +165,7 @@ void createSchemaProtobuf() { .setSchemaField(readFileAsString(PATH_PROTOBUF_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.PROTOBUF) .clickSubmit() - .isOnSchemaViewPage(); + .waitUntilScreenReady(); pages.mainPage .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.isSchemaVisible(SCHEMA_PROTOBUF_CREATE); @@ -182,7 +182,7 @@ void deleteSchemaProtobuf() { pages.openMainPage() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.openSchema(SCHEMA_PROTOBUF_API) - .isOnSchemaViewPage() + .waitUntilScreenReady() .removeSchema() .isNotVisible(SCHEMA_PROTOBUF_API); } diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java index 75e5cfa7224..0d6026e1edb 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java @@ -55,7 +55,7 @@ public void createTopic() { pages.topicsList.pressCreateNewTopic() .setTopicName(NEW_TOPIC) .sendData() - .isOnTopicViewPage(); + .waitUntilScreenReady(); pages.open() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.TOPICS) .topicIsVisible(NEW_TOPIC); @@ -74,9 +74,9 @@ public void createTopic() { @Test public void updateTopic() { pages.openTopicsList(SECOND_LOCAL) - .isOnPage(); + .waitUntilScreenReady(); pages.openTopicView(SECOND_LOCAL, TOPIC_TO_UPDATE) - .isOnTopicViewPage() + .waitUntilScreenReady() .openEditSettings() .selectCleanupPolicy(COMPACT_POLICY_VALUE) .setMinInsyncReplicas(10) @@ -84,10 +84,10 @@ public void updateTopic() { .setMaxSizeOnDiskInGB(UPDATED_MAX_SIZE_ON_DISK) .setMaxMessageBytes(UPDATED_MAX_MESSAGE_BYTES) .sendData() - .isOnTopicViewPage(); + .waitUntilScreenReady(); pages.openTopicsList(SECOND_LOCAL) - .isOnPage(); + .waitUntilScreenReady(); pages.openTopicView(SECOND_LOCAL, TOPIC_TO_UPDATE) .openEditSettings() // Assertions @@ -105,11 +105,11 @@ public void updateTopic() { @Test public void deleteTopic() { pages.openTopicsList(SECOND_LOCAL) - .isOnPage() + .waitUntilScreenReady() .openTopic(TOPIC_TO_DELETE) - .isOnTopicViewPage() + .waitUntilScreenReady() .deleteTopic() - .isOnPage() + .waitUntilScreenReady() .isTopicNotVisible(TOPIC_TO_DELETE); } @@ -121,9 +121,9 @@ public void deleteTopic() { @Test void produceMessage() { pages.openTopicsList(SECOND_LOCAL) - .isOnPage() + .waitUntilScreenReady() .openTopic(TOPIC_TO_UPDATE) - .isOnTopicViewPage() + .waitUntilScreenReady() .openTopicMenu(TopicView.TopicMenu.MESSAGES) .clickOnButton("Produce Message") .setContentFiled(readFileAsString(CONTENT_TO_PRODUCE_MESSAGE))
train
train
2022-08-09T15:53:31
"2022-07-18T15:39:45Z"
Haarolean
train
provectus/kafka-ui/2420_2422
provectus/kafka-ui
provectus/kafka-ui/2420
provectus/kafka-ui/2422
[ "keyword_pr_to_issue" ]
9b3495a2e9e2c92b8a073bc4ed835a0c30b66b27
5fdcd2124cce0cec8eaf7e09416fff55259eace4
[]
[]
"2022-08-10T14:27:03Z"
[ "scope/frontend", "status/accepted", "type/chore" ]
Typo within Topic's statistics
**Describe the bug** The word "unique" is written wrong within Topic's statistics page **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics Statistics page 2. Start Analysis 3. Check the "Messages" part **Expected behavior** The "Unique keys" and "Unique values" should be displayed **Screenshots** <img width="1726" alt="unique" src="https://user-images.githubusercontent.com/104780608/183740313-a15f00a4-3a96-4645-abe7-2e929c18dc26.png">
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx index dec863c0005..f2aa53141e8 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Details/Statistics/Indicators/Total.tsx @@ -24,14 +24,14 @@ const Total: React.FC<TopicAnalysisStats> = ({ </Metrics.Indicator> <Metrics.Indicator label="Null keys">{nullKeys}</Metrics.Indicator> <Metrics.Indicator - label="Uniq keys" + label="Unique keys" title="Approximate number of unique keys" > {approxUniqKeys} </Metrics.Indicator> <Metrics.Indicator label="Null values">{nullValues}</Metrics.Indicator> <Metrics.Indicator - label="Uniq values" + label="Unique values" title="Approximate number of unique values" > {approxUniqValues}
null
test
train
2022-08-10T11:58:41
"2022-08-09T19:04:13Z"
armenuikafka
train
provectus/kafka-ui/1666_2424
provectus/kafka-ui
provectus/kafka-ui/1666
provectus/kafka-ui/2424
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.9741795151534492)" ]
9b3495a2e9e2c92b8a073bc4ed835a0c30b66b27
a665fb4d83041857995a5b47eb307c70a3ae289b
[ "I can take this one. What about this solution? specifying \"api-prod\" as arg.\r\n```\r\ncd kafka-ui-api\r\n ./mvnw compile -Papi-prod \r\n```\r\n\r\nActually the ` ./mvnw compile -Pprod` doesn't compile the \"kafka-ui-contract\" (or am I wrong?). So when compiling the \"kafka-ui-api\" if you dont compile the contract you can receive some erros (if the contracts were updated). \r\n\r\nMaybe this is an another issue?", "Sure, thanks.\r\nYeah, let's do something like this.\r\nIt depends on the directory you run this command in.\r\n", "Since we don't have the wrapper as part of the modules what is the approach we want to follow regarding this request? Do we want to skip frontend build for ui-api and ui-contract?\r\n\r\nBy running something like this ./mvnw clean install -Pprod -DskipUIBuild=true\r\n\r\nAnd changing this on both POM's: \r\n\r\n**configuration.skip.skipUIBuild**\r\n\r\nOriginal Request is based on this possibility but it is not anymore like this:\r\n\r\ncd kafka-ui-api\r\n ./mvnw compile -Pprod \r\n \r\n @Haarolean ", "> Since we don't have the wrapper as part of the modules what is the approach we want to follow regarding this request? Do we want to skip frontend build for ui-api and ui-contract?\r\n> \r\n> By running something like this ./mvnw clean install -Pprod -DskipUIBuild=true\r\n> \r\n> And changing this on both POM's:\r\n> \r\n> **configuration.skip.skipUIBuild**\r\n> \r\n> Original Request is based on this possibility but it is not anymore like this:\r\n> \r\n> cd kafka-ui-api ./mvnw compile -Pprod\r\n> \r\n> @Haarolean\r\n\r\nwhat if we create a new profile with skipUIBuild property set to enabled by default?", "> > Since we don't have the wrapper as part of the modules what is the approach we want to follow regarding this request? Do we want to skip frontend build for ui-api and ui-contract?\r\n> > By running something like this ./mvnw clean install -Pprod -DskipUIBuild=true\r\n> > And changing this on both POM's:\r\n> > **configuration.skip.skipUIBuild**\r\n> > Original Request is based on this possibility but it is not anymore like this:\r\n> > cd kafka-ui-api ./mvnw compile -Pprod\r\n> > @Haarolean\r\n> \r\n> what if we create a new profile with skipUIBuild property set to enabled by default?\r\n\r\nThat is what the closed PR was doing and was pushed back due to duplicated code. ", "@Haarolean Lets not create unnecessary profiles. I suggested simple solution here https://github.com/provectus/kafka-ui/pull/1696#discussion_r830601917 . \r\nIn should work like `./mvnw compile -DskipUiBuild=true`. ", "> @Haarolean Lets not create unnecessary profiles. I suggested simple solution here [#1696 (comment)](https://github.com/provectus/kafka-ui/pull/1696#discussion_r830601917) . In should work like `./mvnw compile -DskipUiBuild=true`.\r\n\r\nDo we want it just for api or also for contract project?", "> @Haarolean Lets not create unnecessary profiles. I suggested simple solution here [#1696 (comment)](https://github.com/provectus/kafka-ui/pull/1696#discussion_r830601917) . In should work like `./mvnw compile -DskipUiBuild=true`.\r\n\r\nhttps://github.com/provectus/kafka-ui/pull/2424 This is solution 2" ]
[]
"2022-08-10T17:56:19Z"
[ "type/enhancement", "scope/backend", "status/accepted", "type/chore" ]
Request: a command to compile just `kafka-ui-api`, ignoring the frontend
### Is your proposal related to a problem? I'd like to be able to iterate building the Java portion of `kafka-ui` without building the frontend. I assumed this would work, but it runs npm: ``` cd kafka-ui-api ./mvnw compile -Pprod ``` I was told to open this issue on Discord. ### Describe the solution you'd like ``` cd kafka-ui-api ./mvnw compile -Pprod ``` The above should compile just the backend.
[ "documentation/project/contributing/building.md", "kafka-ui-api/pom.xml" ]
[ "documentation/project/contributing/building.md", "kafka-ui-api/pom.xml" ]
[]
diff --git a/documentation/project/contributing/building.md b/documentation/project/contributing/building.md index 21562426e5e..7b530829b92 100644 --- a/documentation/project/contributing/building.md +++ b/documentation/project/contributing/building.md @@ -19,6 +19,13 @@ docker-compose -f ./documentation/compose/kafka-clusters-only.yaml up -d Then, start the app. +### Building only the API + +To build only the kafka-ui-api you can use this command: +```sh +./mvnw -f kafka-ui-api/pom.xml clean install -Pprod -DskipUIBuild=true +``` + ## Where to go next In the next section, you'll [learn how to run the application](running.md). \ No newline at end of file diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index 53eb24731d7..5c00dc0f6d9 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -404,6 +404,7 @@ <artifactId>frontend-maven-plugin</artifactId> <version>${frontend-maven-plugin.version}</version> <configuration> + <skip>${skipUIBuild}</skip> <workingDirectory>../kafka-ui-react-app</workingDirectory> <environmentVariables> <VITE_TAG>${project.version}</VITE_TAG>
null
train
train
2022-08-10T11:58:41
"2022-02-22T22:53:16Z"
ottaviohartman
train
provectus/kafka-ui/2203_2425
provectus/kafka-ui
provectus/kafka-ui/2203
provectus/kafka-ui/2425
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.8707543666410622)" ]
9b3495a2e9e2c92b8a073bc4ed835a0c30b66b27
24243e36acd288123c15b1c9b83172b6f04f9d32
[]
[]
"2022-08-10T21:18:20Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
In case of using "," symbol system doesn't give the validation message for Topic name field
**Describe the bug** When the "," symbol is used, the system doesn't show the validation message for the Topic name field **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Add a Topic 3. Input the ";" symbol into the Topic name field 4. Make sure the validation message "Only alphanumeric, _, -, and . allowed" appears 5. Input the "," symbol into the Topic name field **Expected behavior** A validation message about the disallowed symbol should be displayed (currently it is not) **Screenshots** <img width="1718" alt="topic name" src="https://user-images.githubusercontent.com/104780608/175309599-2fd9636e-c005-472e-9c8b-6ffa91398276.png">
[ "kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx", "kafka-ui-react-app/src/lib/constants.ts", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[ "kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx", "kafka-ui-react-app/src/lib/constants.ts", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx b/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx index a15ef31f680..0e01da66cb6 100644 --- a/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx @@ -77,6 +77,18 @@ describe('New', () => { expect(mockNavigate).not.toHaveBeenCalled(); }); + it('validates form invalid name', async () => { + await act(() => renderComponent(clusterTopicNewPath(clusterName))); + await waitFor(() => { + userEvent.type(screen.getByPlaceholderText('Topic Name'), 'Invalid,Name'); + }); + await waitFor(() => { + expect( + screen.getByText('Only alphanumeric, _, -, and . allowed') + ).toBeInTheDocument(); + }); + }); + it('submits valid form', async () => { await act(() => renderComponent(clusterTopicNewPath(clusterName))); await act(() => { diff --git a/kafka-ui-react-app/src/lib/constants.ts b/kafka-ui-react-app/src/lib/constants.ts index 0be9810babe..5626c06ee34 100644 --- a/kafka-ui-react-app/src/lib/constants.ts +++ b/kafka-ui-react-app/src/lib/constants.ts @@ -15,7 +15,7 @@ export const BASE_PARAMS: ConfigurationParameters = { }, }; -export const TOPIC_NAME_VALIDATION_PATTERN = /^[.,A-Za-z0-9_-]+$/; +export const TOPIC_NAME_VALIDATION_PATTERN = /^[a-zA-Z0-9._-]+$/; export const SCHEMA_NAME_VALIDATION_PATTERN = /^[.,A-Za-z0-9_/-]+$/; export const TOPIC_CUSTOM_PARAMS_PREFIX = 'customParams'; diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index ca51e662f15..d9c476aa76a 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -47,6 +47,7 @@ export default yup; export const topicFormValidationSchema = yup.object().shape({ name: yup .string() + .max(249) .required() .matches( TOPIC_NAME_VALIDATION_PATTERN,
null
train
train
2022-08-10T11:58:41
"2022-06-23T13:47:58Z"
armenuikafka
train
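The gold patch above tightens the topic-name pattern to `/^[a-zA-Z0-9._-]+$/` (the old pattern `/^[.,A-Za-z0-9_-]+$/` accidentally allowed ",") and caps the name at 249 characters, which is Kafka's topic-name length limit. A minimal TypeScript sketch of the same check outside of yup — `isValidTopicName` is a hypothetical helper name, not part of the codebase:

```typescript
// Same pattern and length cap as the patched yup schema.
const TOPIC_NAME_VALIDATION_PATTERN = /^[a-zA-Z0-9._-]+$/;

function isValidTopicName(name: string): boolean {
  // Rejects "," and any other character outside the allowed set,
  // which the original pattern failed to do.
  return (
    name.length > 0 &&
    name.length <= 249 &&
    TOPIC_NAME_VALIDATION_PATTERN.test(name)
  );
}

// isValidTopicName('my.topic-1')   === true
// isValidTopicName('Invalid,Name') === false
```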
provectus/kafka-ui/2445_2447
provectus/kafka-ui
provectus/kafka-ui/2445
provectus/kafka-ui/2447
[ "timestamp(timedelta=1.0, similarity=0.8890880368006049)", "connected" ]
91b86b5b780eb6c090c89eebd405f3487147e6da
c8306f59700f574d6e68f31a121f626440b47a20
[]
[ "This empty line could be removed :).", "Instead of hardcoding `CONNECT_REST_PORT`, we can use `CONNECT_PORT` defined already in the class.", "I like the spacing like separating logical blocks", "sure" ]
"2022-08-15T07:53:06Z"
[ "type/bug", "scope/backend", "status/accepted", "status/confirmed" ]
Consumers sorting is broken
<!-- Don't forget to check for existing issues/discussions regarding your proposal. We might already have it. https://github.com/provectus/kafka-ui/issues https://github.com/provectus/kafka-ui/discussions --> **Describe the bug** <!--(A clear and concise description of what the bug is.)--> /api/clusters/local/consumer-groups/paged?page=1&perPage=25&search=&orderBy=MEMBERS&sortOrder=DESC returns payload in ASC order **Set up** <!-- How do you run the app? Please provide as much info as possible: 1. App version (docker image version or check commit hash in the top left corner in UI) 2. Helm chart version, if you use one 3. Any IAAC configs We might close the issue without further explanation if you don't provide such information. --> **Steps to Reproduce** <!-- We'd like you to provide an example setup (via docker-compose, helm, etc.) to reproduce the problem, especially with a complex setups. --> Steps to reproduce the behavior: 1. **Expected behavior** <!-- (A clear and concise description of what you expected to happen) --> **Screenshots** <!-- (If applicable, add screenshots to help explain your problem) --> **Additional context** <!-- (Add any other context about the problem here) -->
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java" ]
[ "kafka-ui-api/src/test/java/com/provectus/kafka/ui/KafkaConsumerGroupTests.java", "kafka-ui-api/src/test/java/com/provectus/kafka/ui/container/KafkaConnectContainer.java" ]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java index beb9f849798..d17f5249781 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java @@ -161,7 +161,7 @@ private Comparator<ConsumerGroupDescription> getPaginationComparator(ConsumerGro }; return Comparator.comparingInt(statesPriorities); case MEMBERS: - return Comparator.comparingInt(cg -> -cg.members().size()); + return Comparator.comparingInt(cg -> cg.members().size()); default: throw new IllegalStateException("Unsupported order by: " + orderBy); }
diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/KafkaConsumerGroupTests.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/KafkaConsumerGroupTests.java index 17e7a19ee7f..98f8394060f 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/KafkaConsumerGroupTests.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/KafkaConsumerGroupTests.java @@ -14,6 +14,7 @@ import java.util.stream.Stream; import lombok.extern.slf4j.Slf4j; import lombok.val; +import org.apache.commons.lang3.RandomStringUtils; import org.apache.kafka.clients.admin.NewTopic; import org.apache.kafka.clients.consumer.ConsumerConfig; import org.apache.kafka.clients.consumer.KafkaConsumer; @@ -126,6 +127,21 @@ void shouldReturnConsumerGroupsWithPagination() throws Exception { assertThat(page.getConsumerGroups()) .isSortedAccordingTo(Comparator.comparing(ConsumerGroupDTO::getGroupId).reversed()); }); + + webTestClient + .get() + .uri("/api/clusters/{clusterName}/consumer-groups/paged?perPage=10&&search" + + "=cgPageTest&orderBy=MEMBERS&sortOrder=DESC", LOCAL) + .exchange() + .expectStatus() + .isOk() + .expectBody(ConsumerGroupsPageResponseDTO.class) + .value(page -> { + assertThat(page.getPageCount()).isEqualTo(1); + assertThat(page.getConsumerGroups().size()).isEqualTo(5); + assertThat(page.getConsumerGroups()) + .isSortedAccordingTo(Comparator.comparing(ConsumerGroupDTO::getMembers).reversed()); + }); } } @@ -133,7 +149,7 @@ private Closeable startConsumerGroups(int count, String consumerGroupPrefix) { String topicName = createTopicWithRandomName(); var consumers = Stream.generate(() -> { - String groupId = consumerGroupPrefix + UUID.randomUUID(); + String groupId = consumerGroupPrefix + RandomStringUtils.randomAlphabetic(5); val consumer = createTestConsumerWithGroupId(groupId); consumer.subscribe(List.of(topicName)); consumer.poll(Duration.ofMillis(100)); diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/container/KafkaConnectContainer.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/container/KafkaConnectContainer.java index dd8d5d03c41..b0098b307f5 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/container/KafkaConnectContainer.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/container/KafkaConnectContainer.java @@ -12,6 +12,7 @@ public class KafkaConnectContainer extends GenericContainer<KafkaConnectContaine public KafkaConnectContainer(String version) { super("confluentinc/cp-kafka-connect:" + version); addExposedPort(CONNECT_PORT); + waitStrategy = Wait.forHttp("/") .withStartupTimeout(Duration.ofMinutes(5)); } @@ -37,6 +38,7 @@ public KafkaConnectContainer withKafka(Network network, String bootstrapServers) withEnv("CONNECT_INTERNAL_KEY_CONVERTER", "org.apache.kafka.connect.json.JsonConverter"); withEnv("CONNECT_INTERNAL_VALUE_CONVERTER", "org.apache.kafka.connect.json.JsonConverter"); withEnv("CONNECT_REST_ADVERTISED_HOST_NAME", "kafka-connect"); + withEnv("CONNECT_REST_PORT", String.valueOf(CONNECT_PORT)); withEnv("CONNECT_PLUGIN_PATH", "/usr/share/java,/usr/share/confluent-hub-components"); return self(); }
train
train
2022-09-01T16:57:36
"2022-08-14T17:32:34Z"
workshur
train
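The one-character fix in the gold patch above removes a sign flip: the MEMBERS comparator already negated the member count, and the DESC sort order applied a second reversal on top, so the two cancelled and descending requests came back ascending. A minimal TypeScript sketch of the same cancellation — the `Group` type and sample data are illustrative, not from the codebase:

```typescript
interface Group {
  members: number;
}

const groups: Group[] = [{ members: 3 }, { members: 1 }, { members: 2 }];

// Buggy shape: negate the key inside the base comparator...
const byMembersNegated = (a: Group, b: Group) => -a.members - -b.members;
// ...then apply the DESC reversal on top — the two sign flips cancel out:
const desc = [...groups].sort((a, b) => byMembersNegated(b, a));
// desc is [{members: 1}, {members: 2}, {members: 3}] — ascending, despite DESC.

// Fixed shape: keep the base comparator ascending and let the single
// DESC reversal do its job.
const byMembers = (a: Group, b: Group) => a.members - b.members;
const descFixed = [...groups].sort((a, b) => byMembers(b, a));
// descFixed is [{members: 3}, {members: 2}, {members: 1}].
```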
provectus/kafka-ui/2320_2455
provectus/kafka-ui
provectus/kafka-ui/2320
provectus/kafka-ui/2455
[ "keyword_pr_to_issue" ]
a5f539c62aff5f97d244927d63f185475397636c
95a030614337529ff3dd578bb03fe1de18e5da7e
[]
[ "let's try to hide expander for rows with empty trace. You can use `getRowCanExpand` to check if row.original.trace.length > 0 ", "let's add confirmation modal. just add `confirm` attr to DropdownItem", "```suggestion\r\n const trace = getValue<string>() || '';\r\n```" ]
"2022-08-16T21:02:33Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
Hide KC connectors' stacktraces into a spoiler
Hide the stack trace behind a spoiler: <img width="1291" alt="image" src="https://user-images.githubusercontent.com/1494347/180452189-f6240195-5c9b-4156-be91-95d4cdd8517b.png"> like this: <img width="276" alt="image" src="https://user-images.githubusercontent.com/1494347/180452263-2e488170-9e68-4431-aa00-d9372bf8058d.png">
[ "kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx", "kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx", "kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts", "kafka-ui-react-app/src/components/common/NewTable/Table.tsx", "kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts", "kafka-ui-react-app/src/theme/theme.ts" ]
[ "kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx", "kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx", "kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts", "kafka-ui-react-app/src/components/common/NewTable/Table.tsx", "kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts", "kafka-ui-react-app/src/theme/theme.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx
new file mode 100644
index 00000000000..6d2cf845e1b
--- /dev/null
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx
@@ -0,0 +1,32 @@
+import React from 'react';
+import { Task } from 'generated-sources';
+import { CellContext } from '@tanstack/react-table';
+import useAppParams from 'lib/hooks/useAppParams';
+import { useRestartConnectorTask } from 'lib/hooks/api/kafkaConnect';
+import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { RouterParamsClusterConnectConnector } from 'lib/paths';
+
+const ActionsCellTasks: React.FC<CellContext<Task, unknown>> = ({ row }) => {
+  const { id } = row.original;
+  const routerProps = useAppParams<RouterParamsClusterConnectConnector>();
+  const restartMutation = useRestartConnectorTask(routerProps);
+
+  const restartTaskHandler = (taskId?: number) => {
+    if (taskId === undefined) return;
+    restartMutation.mutateAsync(taskId);
+  };
+
+  return (
+    <Dropdown>
+      <DropdownItem
+        onClick={() => restartTaskHandler(id?.task)}
+        danger
+        confirm="Are you sure you want to restart the task?"
+      >
+        <span>Restart task</span>
+      </DropdownItem>
+    </Dropdown>
+  );
+};
+
+export default ActionsCellTasks;
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
index 74dd89ab873..bb21e895380 100644
--- a/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
@@ -1,69 +1,58 @@
 import React from 'react';
-import { Table } from 'components/common/table/Table/Table.styled';
-import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell';
-import {
-  useConnectorTasks,
-  useRestartConnectorTask,
-} from 'lib/hooks/api/kafkaConnect';
+import { useConnectorTasks } from 'lib/hooks/api/kafkaConnect';
 import useAppParams from 'lib/hooks/useAppParams';
 import { RouterParamsClusterConnectConnector } from 'lib/paths';
-import getTagColor from 'components/common/Tag/getTagColor';
-import { Tag } from 'components/common/Tag/Tag.styled';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { ColumnDef, Row } from '@tanstack/react-table';
+import { Task } from 'generated-sources';
+import Table, { TagCell } from 'components/common/NewTable';
+
+import ActionsCellTasks from './ActionsCellTasks';
+
+const ExpandedTaskRow: React.FC<{ row: Row<Task> }> = ({ row }) => {
+  return <div>{row.original.status.trace}</div>;
+};
+
+const MAX_LENGTH = 100;

 const Tasks: React.FC = () => {
   const routerProps = useAppParams<RouterParamsClusterConnectConnector>();
-  const { data: tasks } = useConnectorTasks(routerProps);
-  const restartMutation = useRestartConnectorTask(routerProps);
+  const { data = [] } = useConnectorTasks(routerProps);

-  const restartTaskHandler = (taskId?: number) => {
-    if (taskId === undefined) return;
-    restartMutation.mutateAsync(taskId);
-  };
+  const columns = React.useMemo<ColumnDef<Task>[]>(
+    () => [
+      { header: 'ID', accessorKey: 'status.id' },
+      { header: 'Worker', accessorKey: 'status.workerId' },
+      { header: 'State', accessorKey: 'status.state', cell: TagCell },
+      {
+        header: 'Trace',
+        accessorKey: 'status.trace',
+        enableSorting: false,
+        cell: ({ getValue }) => {
+          const trace = getValue<string>() || '';
+          return trace.toString().length > MAX_LENGTH
+            ? `${trace.toString().substring(0, MAX_LENGTH - 3)}...`
+            : trace;
+        },
+        meta: { width: '70%' },
+      },
+      {
+        id: 'actions',
+        header: '',
+        cell: ActionsCellTasks,
+      },
+    ],
+    []
+  );

   return (
-    <Table isFullwidth>
-      <thead>
-        <tr>
-          <TableHeaderCell title="ID" />
-          <TableHeaderCell title="Worker" />
-          <TableHeaderCell title="State" />
-          <TableHeaderCell title="Trace" />
-          <TableHeaderCell />
-        </tr>
-      </thead>
-      <tbody>
-        {tasks?.length === 0 && (
-          <tr>
-            <td colSpan={10}>No tasks found</td>
-          </tr>
-        )}
-        {tasks?.map((task) => (
-          <tr key={task.status?.id}>
-            <td>{task.status?.id}</td>
-            <td>{task.status?.workerId}</td>
-            <td>
-              <Tag color={getTagColor(task.status.state)}>
-                {task.status.state}
-              </Tag>
-            </td>
-            <td>{task.status.trace || 'null'}</td>
-            <td style={{ width: '5%' }}>
-              <div>
-                <Dropdown>
-                  <DropdownItem
-                    onClick={() => restartTaskHandler(task.id?.task)}
-                    danger
-                  >
-                    <span>Restart task</span>
-                  </DropdownItem>
-                </Dropdown>
-              </div>
-            </td>
-          </tr>
-        ))}
-      </tbody>
-    </Table>
+    <Table
+      columns={columns}
+      data={data}
+      emptyMessage="No tasks found"
+      enableSorting
+      getRowCanExpand={(row) => row.original.status.trace?.length > 0}
+      renderSubComponent={ExpandedTaskRow}
+    />
   );
 };
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
index efb72d1812c..da38068a203 100644
--- a/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
@@ -3,8 +3,13 @@ import { render, WithRoute } from 'lib/testHelpers';
 import { clusterConnectConnectorTasksPath } from 'lib/paths';
 import Tasks from 'components/Connect/Details/Tasks/Tasks';
 import { tasks } from 'lib/fixtures/kafkaConnect';
-import { screen } from '@testing-library/dom';
-import { useConnectorTasks } from 'lib/hooks/api/kafkaConnect';
+import { screen, within, waitFor } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import {
+  useConnectorTasks,
+  useRestartConnectorTask,
+} from 'lib/hooks/api/kafkaConnect';
+import { Task } from 'generated-sources';

 jest.mock('lib/hooks/api/kafkaConnect', () => ({
   useConnectorTasks: jest.fn(),
@@ -13,30 +18,109 @@ jest.mock('lib/hooks/api/kafkaConnect', () => ({

 const path = clusterConnectConnectorTasksPath('local', 'ghp', '1');

+const restartConnectorMock = jest.fn();
+
 describe('Tasks', () => {
-  const renderComponent = () =>
+  beforeEach(() => {
+    (useRestartConnectorTask as jest.Mock).mockImplementation(() => ({
+      mutateAsync: restartConnectorMock,
+    }));
+  });
+
+  const renderComponent = (currentData: Task[] | undefined = undefined) => {
+    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
+      data: currentData,
+    }));
+
     render(
       <WithRoute path={clusterConnectConnectorTasksPath()}>
         <Tasks />
       </WithRoute>,
       { initialEntries: [path] }
     );
+  };

   it('renders empty table', () => {
-    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
-      data: [],
-    }));
-
     renderComponent();
     expect(screen.getByRole('table')).toBeInTheDocument();
     expect(screen.getByText('No tasks found')).toBeInTheDocument();
   });

   it('renders tasks table', () => {
-    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
-      data: tasks,
-    }));
-
-    renderComponent();
+    renderComponent(tasks);
     expect(screen.getAllByRole('row').length).toEqual(tasks.length + 1);
+
+    expect(
+      screen.getByRole('row', {
+        name: '1 kafka-connect0:8083 RUNNING',
+      })
+    ).toBeInTheDocument();
+  });
+
+  it('renders truncates long trace and expands', () => {
+    renderComponent(tasks);
+
+    const trace = tasks[2]?.status?.trace || '';
+    const truncatedTrace = trace.toString().substring(0, 100 - 3);
+
+    const thirdRow = screen.getByRole('row', {
+      name: `3 kafka-connect0:8083 RUNNING ${truncatedTrace}...`,
+    });
+    expect(thirdRow).toBeInTheDocument();
+
+    const expandedDetails = screen.queryByText(trace);
+    // Full trace is not visible
+    expect(expandedDetails).not.toBeInTheDocument();
+
+    userEvent.click(thirdRow);
+
+    expect(
+      screen.getByRole('row', {
+        name: trace,
+      })
+    ).toBeInTheDocument();
+  });
+
+  describe('Action button', () => {
+    const expectDropdownExists = () => {
+      const firstTaskRow = screen.getByRole('row', {
+        name: '1 kafka-connect0:8083 RUNNING',
+      });
+      expect(firstTaskRow).toBeInTheDocument();
+      const extBtn = within(firstTaskRow).getByRole('button', {
+        name: 'Dropdown Toggle',
+      });
+      expect(extBtn).toBeEnabled();
+      userEvent.click(extBtn);
+      expect(screen.getByRole('menu')).toBeInTheDocument();
+    };
+
+    it('renders action button', () => {
+      renderComponent(tasks);
+      expectDropdownExists();
+      expect(
+        screen.getAllByRole('button', { name: 'Dropdown Toggle' }).length
+      ).toEqual(tasks.length);
+      // Action buttons are enabled
+      const actionBtn = screen.getAllByRole('menuitem');
+      expect(actionBtn[0]).toHaveTextContent('Restart task');
+    });
+
+    it('works as expected', async () => {
+      renderComponent(tasks);
+      expectDropdownExists();
+      const actionBtn = screen.getAllByRole('menuitem');
+      expect(actionBtn[0]).toHaveTextContent('Restart task');
+
+      userEvent.click(actionBtn[0]);
+      expect(
+        screen.getByText('Are you sure you want to restart the task?')
+      ).toBeInTheDocument();
+
+      expect(screen.getByText('Confirm the action')).toBeInTheDocument();
+      userEvent.click(screen.getByRole('button', { name: 'Confirm' }));
+
+      await waitFor(() => expect(restartConnectorMock).toHaveBeenCalled());
+    });
   });
 });
diff --git a/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx b/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
index d53c3d79028..ff5e501d527 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
+++ b/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
@@ -12,6 +12,7 @@ const ExpanderCell: React.FC<CellContext<unknown, unknown>> = ({ row }) => (
     xmlns="http://www.w3.org/2000/svg"
     role="button"
     aria-label="Expand row"
+    $disabled={!row.getCanExpand()}
   >
     {row.getIsExpanded() ? (
       <path
diff --git a/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts b/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
index 5fc02176ee5..75671beca90 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
+++ b/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
@@ -1,14 +1,15 @@
-import styled from 'styled-components';
+import styled, { css } from 'styled-components';

-export const ExpaderButton = styled.svg(
-  ({ theme: { table } }) => `
-  & > path {
-    fill: ${table.expander.normal};
-    &:hover {
-      fill: ${table.expander.hover};
+export const ExpaderButton = styled.svg<{ $disabled: boolean }>(
+  ({ theme: { table }, $disabled }) => css`
+    & > path {
+      fill: ${table.expander[$disabled ? 'disabled' : 'normal']};
     }
-  }
-`
+
+    &:hover > path {
+      fill: ${table.expander[$disabled ? 'disabled' : 'hover']};
+    }
+  `
 );

 interface ThProps {
diff --git a/kafka-ui-react-app/src/components/common/NewTable/Table.tsx b/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
index aaf02ea4082..3b3a780674c 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
+++ b/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
@@ -246,15 +246,15 @@ const Table: React.FC<TableProps<any>> = ({
             }
           >
             {!!enableRowSelection && (
-              <td key={`${row.id}-select`}>
+              <td key={`${row.id}-select`} style={{ width: '1px' }}>
                 {flexRender(
                   SelectRowCell,
                   row.getVisibleCells()[0].getContext()
                 )}
               </td>
             )}
-            {row.getCanExpand() && (
-              <td key={`${row.id}-expander`}>
+            {table.getCanSomeRowsExpand() && (
+              <td key={`${row.id}-expander`} style={{ width: '1px' }}>
                 {flexRender(
                   ExpanderCell,
                   row.getVisibleCells()[0].getContext()
@@ -264,7 +264,9 @@ const Table: React.FC<TableProps<any>> = ({
             {row
               .getVisibleCells()
               .map(({ id, getContext, column: { columnDef } }) => (
-                <td key={id}>{flexRender(columnDef.cell, getContext())}</td>
+                <td key={id} style={columnDef.meta}>
+                  {flexRender(columnDef.cell, getContext())}
+                </td>
               ))}
           </S.Row>
           {row.getIsExpanded() && renderSubComponent && (
diff --git a/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts b/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
index f885ea405df..8a79760e661 100644
--- a/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
+++ b/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
@@ -93,6 +93,8 @@ export const tasks: Task[] = [
       id: 3,
       state: ConnectorTaskStatus.RUNNING,
       workerId: 'kafka-connect0:8083',
+      trace:
+        'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.',
     },
     config: {
       'batch.size': '3000',
diff --git a/kafka-ui-react-app/src/theme/theme.ts b/kafka-ui-react-app/src/theme/theme.ts
index 4c057645ef1..a962b2a5356 100644
--- a/kafka-ui-react-app/src/theme/theme.ts
+++ b/kafka-ui-react-app/src/theme/theme.ts
@@ -346,6 +346,7 @@ const theme = {
     expander: {
       normal: Colors.brand[50],
       hover: Colors.brand[20],
+      disabled: Colors.neutral[10],
     },
   },
   primaryTab: {
null
train
train
2022-08-18T10:16:00
"2022-07-22T13:45:11Z"
Haarolean
train
provectus/kafka-ui/2321_2455
provectus/kafka-ui
provectus/kafka-ui/2321
provectus/kafka-ui/2455
[ "keyword_pr_to_issue" ]
a5f539c62aff5f97d244927d63f185475397636c
95a030614337529ff3dd578bb03fe1de18e5da7e
[]
[ "let's try to hide expander for rows with empty trace. You can use `getRowCanExpand` to check if row.original.trace.length > 0 ", "let's add confirmation modal. just add `confirm` attr to DropdownItem", "```suggestion\r\n const trace = getValue<string>() || '';\r\n```" ]
"2022-08-16T21:02:33Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
Do not display "null" as a stacktrace in KC
Make it an empty space <img width="689" alt="image" src="https://user-images.githubusercontent.com/1494347/180452422-6e0b8980-f470-4e87-881f-6ec3af2cbf07.png">
[ "kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx", "kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx", "kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts", "kafka-ui-react-app/src/components/common/NewTable/Table.tsx", "kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts", "kafka-ui-react-app/src/theme/theme.ts" ]
[ "kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx", "kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx", "kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx", "kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts", "kafka-ui-react-app/src/components/common/NewTable/Table.tsx", "kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts", "kafka-ui-react-app/src/theme/theme.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx
new file mode 100644
index 00000000000..6d2cf845e1b
--- /dev/null
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/ActionsCellTasks.tsx
@@ -0,0 +1,32 @@
+import React from 'react';
+import { Task } from 'generated-sources';
+import { CellContext } from '@tanstack/react-table';
+import useAppParams from 'lib/hooks/useAppParams';
+import { useRestartConnectorTask } from 'lib/hooks/api/kafkaConnect';
+import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { RouterParamsClusterConnectConnector } from 'lib/paths';
+
+const ActionsCellTasks: React.FC<CellContext<Task, unknown>> = ({ row }) => {
+  const { id } = row.original;
+  const routerProps = useAppParams<RouterParamsClusterConnectConnector>();
+  const restartMutation = useRestartConnectorTask(routerProps);
+
+  const restartTaskHandler = (taskId?: number) => {
+    if (taskId === undefined) return;
+    restartMutation.mutateAsync(taskId);
+  };
+
+  return (
+    <Dropdown>
+      <DropdownItem
+        onClick={() => restartTaskHandler(id?.task)}
+        danger
+        confirm="Are you sure you want to restart the task?"
+      >
+        <span>Restart task</span>
+      </DropdownItem>
+    </Dropdown>
+  );
+};
+
+export default ActionsCellTasks;
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
index 74dd89ab873..bb21e895380 100644
--- a/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/Tasks.tsx
@@ -1,69 +1,58 @@
 import React from 'react';
-import { Table } from 'components/common/table/Table/Table.styled';
-import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell';
-import {
-  useConnectorTasks,
-  useRestartConnectorTask,
-} from 'lib/hooks/api/kafkaConnect';
+import { useConnectorTasks } from 'lib/hooks/api/kafkaConnect';
 import useAppParams from 'lib/hooks/useAppParams';
 import { RouterParamsClusterConnectConnector } from 'lib/paths';
-import getTagColor from 'components/common/Tag/getTagColor';
-import { Tag } from 'components/common/Tag/Tag.styled';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { ColumnDef, Row } from '@tanstack/react-table';
+import { Task } from 'generated-sources';
+import Table, { TagCell } from 'components/common/NewTable';
+
+import ActionsCellTasks from './ActionsCellTasks';
+
+const ExpandedTaskRow: React.FC<{ row: Row<Task> }> = ({ row }) => {
+  return <div>{row.original.status.trace}</div>;
+};
+
+const MAX_LENGTH = 100;

 const Tasks: React.FC = () => {
   const routerProps = useAppParams<RouterParamsClusterConnectConnector>();
-  const { data: tasks } = useConnectorTasks(routerProps);
-  const restartMutation = useRestartConnectorTask(routerProps);
+  const { data = [] } = useConnectorTasks(routerProps);

-  const restartTaskHandler = (taskId?: number) => {
-    if (taskId === undefined) return;
-    restartMutation.mutateAsync(taskId);
-  };
+  const columns = React.useMemo<ColumnDef<Task>[]>(
+    () => [
+      { header: 'ID', accessorKey: 'status.id' },
+      { header: 'Worker', accessorKey: 'status.workerId' },
+      { header: 'State', accessorKey: 'status.state', cell: TagCell },
+      {
+        header: 'Trace',
+        accessorKey: 'status.trace',
+        enableSorting: false,
+        cell: ({ getValue }) => {
+          const trace = getValue<string>() || '';
+          return trace.toString().length > MAX_LENGTH
+            ? `${trace.toString().substring(0, MAX_LENGTH - 3)}...`
+            : trace;
+        },
+        meta: { width: '70%' },
+      },
+      {
+        id: 'actions',
+        header: '',
+        cell: ActionsCellTasks,
+      },
+    ],
+    []
+  );

   return (
-    <Table isFullwidth>
-      <thead>
-        <tr>
-          <TableHeaderCell title="ID" />
-          <TableHeaderCell title="Worker" />
-          <TableHeaderCell title="State" />
-          <TableHeaderCell title="Trace" />
-          <TableHeaderCell />
-        </tr>
-      </thead>
-      <tbody>
-        {tasks?.length === 0 && (
-          <tr>
-            <td colSpan={10}>No tasks found</td>
-          </tr>
-        )}
-        {tasks?.map((task) => (
-          <tr key={task.status?.id}>
-            <td>{task.status?.id}</td>
-            <td>{task.status?.workerId}</td>
-            <td>
-              <Tag color={getTagColor(task.status.state)}>
-                {task.status.state}
-              </Tag>
-            </td>
-            <td>{task.status.trace || 'null'}</td>
-            <td style={{ width: '5%' }}>
-              <div>
-                <Dropdown>
-                  <DropdownItem
-                    onClick={() => restartTaskHandler(task.id?.task)}
-                    danger
-                  >
-                    <span>Restart task</span>
-                  </DropdownItem>
-                </Dropdown>
-              </div>
-            </td>
-          </tr>
-        ))}
-      </tbody>
-    </Table>
+    <Table
+      columns={columns}
+      data={data}
+      emptyMessage="No tasks found"
+      enableSorting
+      getRowCanExpand={(row) => row.original.status.trace?.length > 0}
+      renderSubComponent={ExpandedTaskRow}
+    />
   );
 };
diff --git a/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx b/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
index efb72d1812c..da38068a203 100644
--- a/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
+++ b/kafka-ui-react-app/src/components/Connect/Details/Tasks/__tests__/Tasks.spec.tsx
@@ -3,8 +3,13 @@ import { render, WithRoute } from 'lib/testHelpers';
 import { clusterConnectConnectorTasksPath } from 'lib/paths';
 import Tasks from 'components/Connect/Details/Tasks/Tasks';
 import { tasks } from 'lib/fixtures/kafkaConnect';
-import { screen } from '@testing-library/dom';
-import { useConnectorTasks } from 'lib/hooks/api/kafkaConnect';
+import { screen, within, waitFor } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import {
+  useConnectorTasks,
+  useRestartConnectorTask,
+} from 'lib/hooks/api/kafkaConnect';
+import { Task } from 'generated-sources';

 jest.mock('lib/hooks/api/kafkaConnect', () => ({
   useConnectorTasks: jest.fn(),
@@ -13,30 +18,109 @@ jest.mock('lib/hooks/api/kafkaConnect', () => ({

 const path = clusterConnectConnectorTasksPath('local', 'ghp', '1');

+const restartConnectorMock = jest.fn();
+
 describe('Tasks', () => {
-  const renderComponent = () =>
+  beforeEach(() => {
+    (useRestartConnectorTask as jest.Mock).mockImplementation(() => ({
+      mutateAsync: restartConnectorMock,
+    }));
+  });
+
+  const renderComponent = (currentData: Task[] | undefined = undefined) => {
+    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
+      data: currentData,
+    }));
+
     render(
       <WithRoute path={clusterConnectConnectorTasksPath()}>
         <Tasks />
       </WithRoute>,
       { initialEntries: [path] }
     );
+  };

   it('renders empty table', () => {
-    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
-      data: [],
-    }));
-
     renderComponent();
     expect(screen.getByRole('table')).toBeInTheDocument();
     expect(screen.getByText('No tasks found')).toBeInTheDocument();
   });

   it('renders tasks table', () => {
-    (useConnectorTasks as jest.Mock).mockImplementation(() => ({
-      data: tasks,
-    }));
-
-    renderComponent();
+    renderComponent(tasks);
     expect(screen.getAllByRole('row').length).toEqual(tasks.length + 1);
+
+    expect(
+      screen.getByRole('row', {
+        name: '1 kafka-connect0:8083 RUNNING',
+      })
+    ).toBeInTheDocument();
+  });
+
+  it('renders truncates long trace and expands', () => {
+    renderComponent(tasks);
+
+    const trace = tasks[2]?.status?.trace || '';
+    const truncatedTrace = trace.toString().substring(0, 100 - 3);
+
+    const thirdRow = screen.getByRole('row', {
+      name: `3 kafka-connect0:8083 RUNNING ${truncatedTrace}...`,
+    });
+    expect(thirdRow).toBeInTheDocument();
+
+    const expandedDetails = screen.queryByText(trace);
+    // Full trace is not visible
+    expect(expandedDetails).not.toBeInTheDocument();
+
+    userEvent.click(thirdRow);
+
+    expect(
+      screen.getByRole('row', {
+        name: trace,
+      })
+    ).toBeInTheDocument();
+  });
+
+  describe('Action button', () => {
+    const expectDropdownExists = () => {
+      const firstTaskRow = screen.getByRole('row', {
+        name: '1 kafka-connect0:8083 RUNNING',
+      });
+      expect(firstTaskRow).toBeInTheDocument();
+      const extBtn = within(firstTaskRow).getByRole('button', {
+        name: 'Dropdown Toggle',
+      });
+      expect(extBtn).toBeEnabled();
+      userEvent.click(extBtn);
+      expect(screen.getByRole('menu')).toBeInTheDocument();
+    };
+
+    it('renders action button', () => {
+      renderComponent(tasks);
+      expectDropdownExists();
+      expect(
+        screen.getAllByRole('button', { name: 'Dropdown Toggle' }).length
+      ).toEqual(tasks.length);
+      // Action buttons are enabled
+      const actionBtn = screen.getAllByRole('menuitem');
+      expect(actionBtn[0]).toHaveTextContent('Restart task');
+    });
+
+    it('works as expected', async () => {
+      renderComponent(tasks);
+      expectDropdownExists();
+      const actionBtn = screen.getAllByRole('menuitem');
+      expect(actionBtn[0]).toHaveTextContent('Restart task');
+
+      userEvent.click(actionBtn[0]);
+      expect(
+        screen.getByText('Are you sure you want to restart the task?')
+      ).toBeInTheDocument();
+
+      expect(screen.getByText('Confirm the action')).toBeInTheDocument();
+      userEvent.click(screen.getByRole('button', { name: 'Confirm' }));
+
+      await waitFor(() => expect(restartConnectorMock).toHaveBeenCalled());
+    });
   });
 });
diff --git a/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx b/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
index d53c3d79028..ff5e501d527 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
+++ b/kafka-ui-react-app/src/components/common/NewTable/ExpanderCell.tsx
@@ -12,6 +12,7 @@ const ExpanderCell: React.FC<CellContext<unknown, unknown>> = ({ row }) => (
     xmlns="http://www.w3.org/2000/svg"
     role="button"
     aria-label="Expand row"
+    $disabled={!row.getCanExpand()}
   >
     {row.getIsExpanded() ? (
       <path
diff --git a/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts b/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
index 5fc02176ee5..75671beca90 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
+++ b/kafka-ui-react-app/src/components/common/NewTable/Table.styled.ts
@@ -1,14 +1,15 @@
-import styled from 'styled-components';
+import styled, { css } from 'styled-components';

-export const ExpaderButton = styled.svg(
-  ({ theme: { table } }) => `
-  & > path {
-    fill: ${table.expander.normal};
-    &:hover {
-      fill: ${table.expander.hover};
+export const ExpaderButton = styled.svg<{ $disabled: boolean }>(
+  ({ theme: { table }, $disabled }) => css`
+    & > path {
+      fill: ${table.expander[$disabled ? 'disabled' : 'normal']};
     }
-  }
-`
+
+    &:hover > path {
+      fill: ${table.expander[$disabled ? 'disabled' : 'hover']};
+    }
+  `
 );

 interface ThProps {
diff --git a/kafka-ui-react-app/src/components/common/NewTable/Table.tsx b/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
index aaf02ea4082..3b3a780674c 100644
--- a/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
+++ b/kafka-ui-react-app/src/components/common/NewTable/Table.tsx
@@ -246,15 +246,15 @@ const Table: React.FC<TableProps<any>> = ({
             }
           >
             {!!enableRowSelection && (
-              <td key={`${row.id}-select`}>
+              <td key={`${row.id}-select`} style={{ width: '1px' }}>
                 {flexRender(
                   SelectRowCell,
                   row.getVisibleCells()[0].getContext()
                 )}
               </td>
             )}
-            {row.getCanExpand() && (
-              <td key={`${row.id}-expander`}>
+            {table.getCanSomeRowsExpand() && (
+              <td key={`${row.id}-expander`} style={{ width: '1px' }}>
                 {flexRender(
                   ExpanderCell,
                   row.getVisibleCells()[0].getContext()
@@ -264,7 +264,9 @@ const Table: React.FC<TableProps<any>> = ({
             {row
               .getVisibleCells()
              .map(({ id, getContext, column: { columnDef } }) => (
-                <td key={id}>{flexRender(columnDef.cell, getContext())}</td>
+                <td key={id} style={columnDef.meta}>
+                  {flexRender(columnDef.cell, getContext())}
+                </td>
              ))}
           </S.Row>
           {row.getIsExpanded() && renderSubComponent && (
diff --git a/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts b/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
index f885ea405df..8a79760e661 100644
--- a/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
+++ b/kafka-ui-react-app/src/lib/fixtures/kafkaConnect.ts
@@ -93,6 +93,8 @@ export const tasks: Task[] = [
       id: 3,
       state: ConnectorTaskStatus.RUNNING,
       workerId: 'kafka-connect0:8083',
+      trace:
+        'Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.',
     },
     config: {
       'batch.size': '3000',
diff --git a/kafka-ui-react-app/src/theme/theme.ts b/kafka-ui-react-app/src/theme/theme.ts
index 4c057645ef1..a962b2a5356 100644
--- a/kafka-ui-react-app/src/theme/theme.ts
+++ b/kafka-ui-react-app/src/theme/theme.ts
@@ -346,6 +346,7 @@ const theme = {
     expander: {
       normal: Colors.brand[50],
       hover: Colors.brand[20],
+      disabled: Colors.neutral[10],
     },
   },
   primaryTab: {
null
train
train
2022-08-18T10:16:00
"2022-07-22T13:46:18Z"
Haarolean
train
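The two records above (issues 2320 and 2321) land in the same PR 2455: the trace cell falls back to an empty string instead of rendering a literal "null", and long traces are truncated with the full text behind an expandable row. A minimal TypeScript sketch of just the cell-formatting rule from the patch — the MAX_LENGTH of 100 comes from the diff, while `formatTrace` is an illustrative name, not part of the codebase:

```typescript
const MAX_LENGTH = 100;

// Null/undefined traces become '' (no more literal "null" in the table),
// and anything longer than MAX_LENGTH is cut to leave room for "...".
function formatTrace(trace: string | null | undefined): string {
  const value = trace || '';
  return value.length > MAX_LENGTH
    ? `${value.substring(0, MAX_LENGTH - 3)}...`
    : value;
}

// formatTrace(null)            === ''
// formatTrace('short trace')   === 'short trace'
// formatTrace('x'.repeat(150)) === 'x'.repeat(97) + '...'   (length 100)
```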
provectus/kafka-ui/2261_2469
provectus/kafka-ui
provectus/kafka-ui/2261
provectus/kafka-ui/2469
[ "keyword_pr_to_issue", "timestamp(timedelta=0.0, similarity=0.9109802871686771)" ]
5ac52efb7a2ec64a38110893ef0c35152a6b501b
6de2eaeab1f80ac71d63896856c93193ef5cd54e
[]
[]
"2022-08-18T19:10:52Z"
[ "good first issue", "scope/frontend", "status/accepted", "status/confirmed", "type/chore" ]
"Value" field is focused with adding the Custom Parameter in a Topic
**Describe the bug** "Value" field is focused with adding the Custom Parameter in a Topic **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to Topics 2. Add a Topic 3. Press "Add Custom Parameter" **Expected behavior** Value field shouldn’t be focused **Screenshots** https://user-images.githubusercontent.com/104780608/178551325-f86dbc4e-166d-4131-a4ec-f89b5f950838.mov
[ "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx index 38d95f56c8b..a85b4a2fc4d 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/CustomParams/CustomParams.tsx @@ -57,7 +57,9 @@ const CustomParams: React.FC<CustomParamsProps> = ({ isSubmitting }) => { type="button" buttonSize="M" buttonType="secondary" - onClick={() => append({ name: '', value: '' })} + onClick={() => + append({ name: '', value: '' }, { shouldFocus: false }) + } > <PlusIcon /> Add Custom Parameter
null
train
train
2022-08-22T19:04:22
"2022-07-12T17:07:22Z"
armenuikafka
train
provectus/kafka-ui/2431_2470
provectus/kafka-ui
provectus/kafka-ui/2431
provectus/kafka-ui/2470
[ "timestamp(timedelta=1.0, similarity=0.9568163863211588)", "connected" ]
d63c25e3173de78b34426af19416d72035fdcdf7
c0d64d7c566abfdd3ceeaa9c09cd895f0fae3f60
[]
[ "no need to place here Objects.requireNonNull because getResourceAsString() doesn't have return null option now", "no need to place here Objects.requireNonNull because getResourceAsString() doesn't have return null option now", "please use import static FileUtils.* to reduce mentions the same class many times per method", "use import static pls", "Reverted them ", "Reverted", "Done it ", "Done it \r\n", "do we really need () here?", "Done " ]
"2022-08-19T10:39:47Z"
[ "scope/QA", "type/refactoring", "status/accepted" ]
[e2e] get rid of SneakyThrows annotation
In the scope of this issue we need to remove the annotation from the high-level methods and implement exception handling at the lowest level
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java" ]
[ "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java" ]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java
index 22926226403..bbe4dbf0b4d 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/extensions/FileUtils.java
@@ -5,11 +5,23 @@
 import java.io.IOException;
 import java.nio.charset.StandardCharsets;

+import static org.apache.kafka.common.utils.Utils.readFileAsString;

 public class FileUtils {

-    public static String getResourceAsString(String resourceFileName) throws IOException {
-        return IOUtils.resourceToString("/" + resourceFileName, StandardCharsets.UTF_8);
+    public static String getResourceAsString(String resourceFileName) {
+        try {
+            return IOUtils.resourceToString("/" + resourceFileName, StandardCharsets.UTF_8);
+        } catch (IOException e) {
+            throw new RuntimeException(e);
+        }
     }

+    public static String fileToString(String path) {
+        try {
+            return readFileAsString(path);
+        } catch (IOException e) {
+            throw new RuntimeException(e);
+        }
+    }
 }
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java
index 515ab60ffee..cfc15f105ec 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/MainPage.java
@@ -3,11 +3,10 @@
 import com.codeborne.selenide.Condition;
 import com.codeborne.selenide.Selenide;
 import com.codeborne.selenide.SelenideElement;
-import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.extensions.WaitUtils;
+import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.pages.topic.TopicsList;
 import io.qameta.allure.Step;
-import lombok.SneakyThrows;
 import lombok.experimental.ExtensionMethod;
 import org.openqa.selenium.By;
@@ -32,7 +31,6 @@
     public MainPage waitUntilScreenReady() {
         return this;
     }

-    @SneakyThrows
     @Step
     public void topicIsVisible(String topicName) {
         new TopicsList().isTopicVisible(topicName);
@@ -43,18 +41,18 @@
     public void topicIsNotVisible(String topicName){
     }

-  public enum SideMenuOptions {
-    BROKERS("Brokers"),
-    TOPICS("Topics"),
-    CONSUMERS("Consumers"),
-    SCHEMA_REGISTRY("Schema Registry");
+    public enum SideMenuOptions {
+        BROKERS("Brokers"),
+        TOPICS("Topics"),
+        CONSUMERS("Consumers"),
+        SCHEMA_REGISTRY("Schema Registry");

-    final String value;
+        final String value;

-    SideMenuOptions(String value) {
-      this.value = value;
+        SideMenuOptions(String value) {
+            this.value = value;
+        }
     }
-  }
 }
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java
index 69f67a9a4af..a12aca59170 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorCreateView.java
@@ -2,15 +2,15 @@
 import com.codeborne.selenide.Condition;
 import com.codeborne.selenide.SelenideElement;
-import com.provectus.kafka.ui.utils.BrowserUtils;
 import com.provectus.kafka.ui.extensions.WaitUtils;
+import com.provectus.kafka.ui.utils.BrowserUtils;
 import io.qameta.allure.Step;
 import lombok.experimental.ExtensionMethod;
 import org.openqa.selenium.By;

 import static com.codeborne.selenide.Selenide.$;
+import static com.codeborne.selenide.Selenide.sleep;
 import static com.provectus.kafka.ui.screenshots.Screenshooter.log;
-import static java.lang.Thread.sleep;

 @ExtensionMethod(WaitUtils.class)
 public class ConnectorCreateView {
@@ -22,7 +22,7 @@ public class ConnectorCreateView {
     private static final String path = "/ui/clusters/secondLocal/connectors/create_new";

     @Step("Set connector config JSON")
-    public ConnectorsView setConnectorConfig(String connectName, String configJson) throws InterruptedException {
+    public ConnectorsView setConnectorConfig(String connectName, String configJson) {
         nameField.setValue(connectName);
         $("#config").click();
         contentTextArea.setValue("");
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java
index 659a87da2c8..f79f2327765 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorsList.java
@@ -2,11 +2,10 @@
 import com.codeborne.selenide.Condition;
 import com.codeborne.selenide.Selenide;
+import com.provectus.kafka.ui.extensions.WaitUtils;
 import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.utils.BrowserUtils;
-import com.provectus.kafka.ui.extensions.WaitUtils;
 import io.qameta.allure.Step;
-import lombok.SneakyThrows;
 import lombok.experimental.ExtensionMethod;
 import org.openqa.selenium.By;
@@ -35,14 +34,14 @@
     public ConnectorCreateView clickCreateConnectorButton() {
         return new ConnectorCreateView();
     }

-    @SneakyThrows
+
     @Step
     public ConnectorsList openConnector(String connectorName) {
         $(By.linkText(connectorName)).click();
         return this;
     }

-    @SneakyThrows
+
     @Step
     public ConnectorsList isNotVisible(String connectorName) {
         $(By.xpath("//table")).shouldBe(Condition.visible);
@@ -52,8 +51,8 @@
     @Step("Verify that connector {connectorName} is visible in the list")
     public ConnectorsList connectorIsVisibleInList(String connectorName, String topicName) {
-        $x("//table//a[@href='/ui/clusters/local/connects/first/connectors/" + connectorName +"']").shouldBe(Condition.visible);
-        $$(By.linkText(topicName));
+        $x("//table//a[@href='/ui/clusters/local/connects/first/connectors/" + connectorName + "']").shouldBe(Condition.visible);
+        $$(By.linkText(topicName));
         return this;
     }
     @Step
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java
index 34ee71c7ffc..a9cc45d92c9 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaRegistryList.java
@@ -4,7 +4,6 @@
 import com.codeborne.selenide.SelenideElement;
 import com.provectus.kafka.ui.utils.BrowserUtils;
 import io.qameta.allure.Step;
-import lombok.SneakyThrows;
 import org.openqa.selenium.By;

 import static com.codeborne.selenide.Selenide.*;
@@ -23,10 +22,9 @@
     public SchemaView openSchema(String schemaName) {
         return new SchemaView();
     }

-    @SneakyThrows
     @Step
     public SchemaRegistryList isNotVisible(String schemaName) {
-        $x(String.format("//*[contains(text(),'%s')]",schemaName)).shouldNotBe(Condition.visible);
+        $x(String.format("//*[contains(text(),'%s')]", schemaName)).shouldNotBe(Condition.visible);
         return this;
     }
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java
index c3607fceab4..7eccf23fe21 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicView.java
@@ -3,12 +3,11 @@
 import com.codeborne.selenide.Condition;
 import com.codeborne.selenide.Selenide;
 import com.codeborne.selenide.SelenideElement;
-import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.extensions.WaitUtils;
+import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.pages.ProduceMessagePage;
 import com.provectus.kafka.ui.utils.BrowserUtils;
 import io.qameta.allure.Step;
-import lombok.SneakyThrows;
 import lombok.experimental.ExtensionMethod;
 import org.openqa.selenium.By;
@@ -34,7 +33,6 @@
     public TopicView waitUntilScreenReady() {
         return this;
     }

-    @SneakyThrows
     @Step
     public TopicCreateEditSettingsView openEditSettings() {
         BrowserUtils.javaExecutorClick(dotMenuHeader);
@@ -48,7 +46,6 @@
     public TopicView openTopicMenu(TopicMenu menu) {
         return this;
     }

-    @SneakyThrows
     @Step
     public TopicsList deleteTopic() {
         BrowserUtils.javaExecutorClick(dotMenuHeader);
@@ -57,7 +54,6 @@
         return new TopicsList();
     }

-    @SneakyThrows
     @Step
     public ProduceMessagePage clickOnButton(String buttonName) {
         BrowserUtils.javaExecutorClick($(By.xpath(String.format("//div//button[text()='%s']", buttonName))));
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java
index c39a3ca1b96..bbcb67effb0 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/topic/TopicsList.java
@@ -3,11 +3,10 @@
 import com.codeborne.selenide.CollectionCondition;
 import com.codeborne.selenide.Condition;
 import com.codeborne.selenide.Selenide;
-import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.extensions.WaitUtils;
+import com.provectus.kafka.ui.helpers.TestConfiguration;
 import com.provectus.kafka.ui.utils.BrowserUtils;
 import io.qameta.allure.Step;
-import lombok.SneakyThrows;
 import lombok.experimental.ExtensionMethod;
 import org.openqa.selenium.By;
@@ -32,7 +31,7 @@
     public TopicsList waitUntilScreenReady() {
     }

     @Step
-    public TopicCreateEditSettingsView pressCreateNewTopic(){
+    public TopicCreateEditSettingsView pressCreateNewTopic() {
         BrowserUtils.javaExecutorClick($x("//button[normalize-space(text()) ='Add a Topic']"));
         return new TopicCreateEditSettingsView();
     }
@@ -46,14 +45,12 @@
     public TopicsList isTopicVisible(String topicName) {
         return this;
     }

-    @SneakyThrows
     @Step
     public TopicView openTopic(String topicName) {
         $(By.linkText(topicName)).click();
         return new TopicView();
     }

-    @SneakyThrows
     @Step
     public TopicsList isTopicNotVisible(String topicName) {
         $$x("//table/tbody/tr/td[2]")
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java
index 66b016ba85e..d412050b299 100644
--- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java
+++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/screenshots/Screenshooter.java
@@ -16,6 +16,7 @@
 import java.awt.image.BufferedImage;
 import java.io.ByteArrayOutputStream;
 import java.io.File;
+import java.io.IOException;
 import java.nio.file.FileSystems;
 import java.util.List;

@@ -23,131 +24,139 @@

 public class Screenshooter {

-  public static Logger log = LoggerFactory.getLogger(Screenshooter.class);
-
-  private static final int PIXELS_THRESHOLD =
-      Integer.parseInt(System.getProperty("PIXELS_THRESHOLD", "200"));
-  private static final String SCREENSHOTS_FOLDER =
-      System.getProperty("SCREENSHOTS_FOLDER", "com/provectus/kafka/ui/screenshots/");
-  private static final String DIFF_SCREENSHOTS_FOLDER =
-      System.getProperty("DIFF_SCREENSHOTS_FOLDER", "build/__diff__/");
-  private static final String ACTUAL_SCREENSHOTS_FOLDER =
-      System.getProperty("ACTUAL_SCREENSHOTS_FOLDER", "build/__actual__/");
-  private static final boolean SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST =
-      Boolean.parseBoolean(System.getProperty("SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST", "true"));
-  private static final boolean TURN_OFF_SCREENSHOTS =
-      Boolean.parseBoolean(System.getProperty("TURN_OFF_SCREENSHOTS", "false"));
-  private static final boolean USE_LOCAL_BROWSER =
-      Boolean.parseBoolean(System.getProperty("USE_LOCAL_BROWSER", "false"));
-
-  private File newFile(String name) {
-    var file = new File(name);
-    if (!file.exists()) {
-      file.mkdirs();
+    public static Logger log = LoggerFactory.getLogger(Screenshooter.class);
+
+    private static final int PIXELS_THRESHOLD =
+            Integer.parseInt(System.getProperty("PIXELS_THRESHOLD", "200"));
+    private static final String SCREENSHOTS_FOLDER =
+            System.getProperty("SCREENSHOTS_FOLDER", "com/provectus/kafka/ui/screenshots/");
+    private static final String DIFF_SCREENSHOTS_FOLDER =
+            System.getProperty("DIFF_SCREENSHOTS_FOLDER", "build/__diff__/");
+    private static final String ACTUAL_SCREENSHOTS_FOLDER =
+            System.getProperty("ACTUAL_SCREENSHOTS_FOLDER", "build/__actual__/");
+    private static final boolean SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST =
+            Boolean.parseBoolean(System.getProperty("SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST", "true"));
+    private static final boolean TURN_OFF_SCREENSHOTS =
+            Boolean.parseBoolean(System.getProperty("TURN_OFF_SCREENSHOTS", "false"));
+    private static final boolean USE_LOCAL_BROWSER =
+            Boolean.parseBoolean(System.getProperty("USE_LOCAL_BROWSER", "false"));
+
+    private File newFile(String name) {
+        var file = new File(name);
+        if (!file.exists()) {
+            file.mkdirs();
+        }
+        return file;
     }
-    return file;
-  }
-
-  public Screenshooter() {
-    List.of(SCREENSHOTS_FOLDER, DIFF_SCREENSHOTS_FOLDER, ACTUAL_SCREENSHOTS_FOLDER)
-        .forEach(this::newFile);
-  }
-
-  public void compareScreenshots(String name) {
-    compareScreenshots(name, false);
-  }
-
-  @SneakyThrows
-  public void compareScreenshots(String name, boolean shouldUpdateScreenshotIfDiffer) {
-    if (TURN_OFF_SCREENSHOTS || USE_LOCAL_BROWSER) {
-      log.warn(String.format("compareScreenshots turned off due TURN_OFF_SCREENSHOTS || USE_LOCAL_BROWSER: %b || %b"
-          , TURN_OFF_SCREENSHOTS,USE_LOCAL_BROWSER));
-      return;
+
+    public Screenshooter() {
+        List.of(SCREENSHOTS_FOLDER, DIFF_SCREENSHOTS_FOLDER, ACTUAL_SCREENSHOTS_FOLDER)
+                .forEach(this::newFile);
+    }
+
+    public void compareScreenshots(String name) {
+        compareScreenshots(name, false);
+    }
+
+
+    public void compareScreenshots(String name, boolean shouldUpdateScreenshotIfDiffer) {
+        if (TURN_OFF_SCREENSHOTS || USE_LOCAL_BROWSER) {
+            log.warn(String.format("compareScreenshots turned off due TURN_OFF_SCREENSHOTS || USE_LOCAL_BROWSER: %b || %b"
+                    , TURN_OFF_SCREENSHOTS, USE_LOCAL_BROWSER));
+            return;
+        }
+        if (!doesScreenshotExist(name)) {
+            if (SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST) {
+                updateActualScreenshot(name);
+            } else {
+                try {
+                    throw new NoReferenceScreenshotFoundException(name);
+                } catch (NoReferenceScreenshotFoundException e) {
+                    e.printStackTrace();
+                }
+            }
+        } else {
+            makeImageDiff(name, shouldUpdateScreenshotIfDiffer);
+        }
+    }
+
+
+    private void updateActualScreenshot(String name) {
+        Screenshot actual =
+                new AShot().coordsProvider(new WebDriverCoordsProvider()).takeScreenshot(getWebDriver());
+        File file = newFile(SCREENSHOTS_FOLDER + name + ".png");
+        try {
+            ImageIO.write(actual.getImage(), "png", file);
+        } catch (IOException e) {
+            e.printStackTrace();
+        }
+        log.debug(String.format("created screenshot: %s \n at %s", name, file.getAbsolutePath()));
+    }
+
+    private static boolean doesScreenshotExist(String name) {
+        return new File(SCREENSHOTS_FOLDER + name + ".png").exists();
     }
-    if (!doesScreenshotExist(name)) {
-      if (SHOULD_SAVE_SCREENSHOTS_IF_NOT_EXIST) {
-        updateActualScreenshot(name);
-      } else {
-        throw new NoReferenceScreenshotFoundException(name);
-      }
-    } else {
-      makeImageDiff(name, shouldUpdateScreenshotIfDiffer);
+
+    @SneakyThrows
+    private void makeImageDiff(String expectedName, boolean shouldUpdateScreenshotIfDiffer) {
+        String fullPathNameExpected = SCREENSHOTS_FOLDER + expectedName + ".png";
+        String fullPathNameActual = ACTUAL_SCREENSHOTS_FOLDER + expectedName + ".png";
+        String fullPathNameDiff = DIFF_SCREENSHOTS_FOLDER + expectedName + ".png";
+
+        // activating allure plugin for showing diffs in report
+        Allure.label("testType", "screenshotDiff");
+
+        Screenshot actual =
+                new AShot().coordsProvider(new WebDriverCoordsProvider()).takeScreenshot(getWebDriver());
+        ImageIO.write(actual.getImage(), "png", newFile(fullPathNameActual));
+
+        Screenshot expected = new Screenshot(ImageIO.read(newFile(fullPathNameExpected)));
+        ImageDiff diff = new ImageDiffer().makeDiff(actual, expected);
+        BufferedImage diffImage = diff.getMarkedImage();
+        ImageIO.write(diffImage, "png", newFile(fullPathNameDiff));
+        // adding to report
+        diff(fullPathNameDiff);
+        // adding to report
+        actual(fullPathNameActual);
+        // adding to report
+        expected(fullPathNameExpected);
+
+        if (shouldUpdateScreenshotIfDiffer) {
+            if (diff.getDiffSize() > PIXELS_THRESHOLD) {
+                updateActualScreenshot(expectedName);
+            }
+        } else {
+            Assertions.assertTrue(
+                    PIXELS_THRESHOLD >= diff.getDiffSize(),
+                    String.format("Amount of differing pixels should be less or equals than %s, actual %s\n"
+                                    + "diff file: %s",
+                            PIXELS_THRESHOLD, diff.getDiffSize(), FileSystems.getDefault().getPath(fullPathNameDiff).normalize().toAbsolutePath()));
+        }
+    }
+
+    @SneakyThrows
+    private byte[] imgToBytes(String filename) {
+        BufferedImage bImage2 = ImageIO.read(new File(filename));
+        var bos2 = new ByteArrayOutputStream();
+        ImageIO.write(bImage2, "png", bos2);
+        return bos2.toByteArray();
+    }
+
+
+    @Attachment
+    private byte[] actual(String actualFileName) {
+        return imgToBytes(actualFileName);
     }
-  }
-
-  @SneakyThrows
-  private void updateActualScreenshot(String name) {
-    Screenshot actual =
-        new AShot().coordsProvider(new WebDriverCoordsProvider()).takeScreenshot(getWebDriver());
-    File file= newFile(SCREENSHOTS_FOLDER + name + ".png");
-    ImageIO.write(actual.getImage(), "png", file);
-    log.debug(String.format("created screenshot: %s \n at %s", name, file.getAbsolutePath()));
-  }
-
-  private static boolean doesScreenshotExist(String name) {
-    return new File(SCREENSHOTS_FOLDER + name + ".png").exists();
-  }
-
-  @SneakyThrows
-  private void makeImageDiff(String expectedName, boolean shouldUpdateScreenshotIfDiffer) {
-    String fullPathNameExpected = SCREENSHOTS_FOLDER + expectedName + ".png";
-    String fullPathNameActual = ACTUAL_SCREENSHOTS_FOLDER + expectedName + ".png";
-    String fullPathNameDiff = DIFF_SCREENSHOTS_FOLDER + expectedName + ".png";
-
-    // activating allure plugin for showing diffs in report
-    Allure.label("testType", "screenshotDiff");
-
-    Screenshot actual =
-        new AShot().coordsProvider(new WebDriverCoordsProvider()).takeScreenshot(getWebDriver());
-    ImageIO.write(actual.getImage(), "png", newFile(fullPathNameActual));
-
-    Screenshot expected = new Screenshot(ImageIO.read(newFile(fullPathNameExpected)));
-    ImageDiff diff = new ImageDiffer().makeDiff(actual, expected);
-    BufferedImage diffImage = diff.getMarkedImage();
-    ImageIO.write(diffImage, "png", newFile(fullPathNameDiff));
-    // adding to report
-    diff(fullPathNameDiff);
-    // adding to report
-    actual(fullPathNameActual);
-    // adding to report
-    expected(fullPathNameExpected);
-
-    if (shouldUpdateScreenshotIfDiffer) {
-      if (diff.getDiffSize() > PIXELS_THRESHOLD) {
-        updateActualScreenshot(expectedName);
-      }
-    } else {
-      Assertions.assertTrue(
-          PIXELS_THRESHOLD >= diff.getDiffSize(),
-          String.format("Amount of differing pixels should be less or equals than %s, actual %s\n"+
-                  "diff file: %s",
-              PIXELS_THRESHOLD, diff.getDiffSize(), FileSystems.getDefault().getPath(fullPathNameDiff).normalize().toAbsolutePath()));
+
+
+    @Attachment
+    private byte[] expected(String expectedFileName) {
+        return imgToBytes(expectedFileName);
+    }
+
+
+    @Attachment
+    private byte[] diff(String diffFileName) {
+        return imgToBytes(diffFileName);
     }
-  }
-
-  @SneakyThrows
-  private byte[] imgToBytes(String filename) {
-    BufferedImage bImage2 = ImageIO.read(new File(filename));
-    var bos2 = new ByteArrayOutputStream();
-    ImageIO.write(bImage2, "png", bos2);
-    return bos2.toByteArray();
-  }
-
-  @SneakyThrows
-  @Attachment
-  private byte[] actual(String actualFileName) {
-    return imgToBytes(actualFileName);
-  }
-
-  @SneakyThrows
-  @Attachment
-  private byte[] expected(String expectedFileName) {
-    return imgToBytes(expectedFileName);
-  }
-
-  @SneakyThrows
-  @Attachment
-  private byte[] diff(String diffFileName) {
-    return imgToBytes(diffFileName);
-  }
 }
diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java index 149acf4aa56..fbfb0990eb9 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/SmokeTests.java @@ -4,7 +4,6 @@ import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; import io.qase.api.annotation.CaseId; -import lombok.SneakyThrows; import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; @@ -12,7 +11,6 @@ public class SmokeTests extends BaseTest { @Test @AutomationStatus(status = Status.AUTOMATED) @CaseId(198) - @SneakyThrows @DisplayName("main page should load") void mainPageLoads() { pages.open() diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java index fe0e468e45f..379114fdd84 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java @@ -1,18 +1,18 @@ package com.provectus.kafka.ui.tests; import com.provectus.kafka.ui.base.BaseTest; -import com.provectus.kafka.ui.extensions.FileUtils; import com.provectus.kafka.ui.helpers.ApiHelper; import com.provectus.kafka.ui.helpers.Helpers; import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; +import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; import io.qase.api.annotation.CaseId; -import lombok.SneakyThrows; import org.junit.jupiter.api.AfterAll; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; -import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; + +import static com.provectus.kafka.ui.extensions.FileUtils.getResourceAsString; public class ConnectorsTests extends BaseTest { @@ -28,13 +28,12 @@ public class ConnectorsTests extends BaseTest { public static final String CONNECTOR_FOR_UPDATE = "sink_postgres_activities_e2e_checks_for_update"; @BeforeAll - @SneakyThrows public static void beforeAll() { ApiHelper apiHelper = Helpers.INSTANCE.apiHelper; - String connectorToDelete = FileUtils.getResourceAsString("delete_connector_config.json"); - String connectorToUpdate = FileUtils.getResourceAsString("config_for_create_connector_via_api.json"); - String message = FileUtils.getResourceAsString("message_content_create_topic.json"); + String connectorToDelete = getResourceAsString("delete_connector_config.json"); + String connectorToUpdate = getResourceAsString("config_for_create_connector_via_api.json"); + String message = getResourceAsString("message_content_create_topic.json"); apiHelper.deleteTopic(LOCAL_CLUSTER, CONNECTOR_FOR_DELETE); @@ -52,7 +51,6 @@ public static void beforeAll() { } @AfterAll - @SneakyThrows public static void afterAll() { ApiHelper apiHelper = Helpers.INSTANCE.apiHelper; apiHelper.deleteConnector(LOCAL_CLUSTER, FIRST_CONNECTOR, SINK_CONNECTOR); @@ -62,7 +60,6 @@ public static void afterAll() { apiHelper.deleteTopic(LOCAL_CLUSTER, TOPIC_FOR_UPDATE_CONNECTOR); } - @SneakyThrows @DisplayName("should create a connector") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -75,13 +72,12 @@ public void createConnector() { 
.waitUntilScreenReady() .setConnectorConfig( SINK_CONNECTOR, - FileUtils.getResourceAsString("config_for_create_connector.json")); + getResourceAsString("config_for_create_connector.json")); pages.openConnectorsList(LOCAL_CLUSTER) .waitUntilScreenReady() .connectorIsVisibleInList(SINK_CONNECTOR, TOPIC_FOR_CONNECTOR); } - @SneakyThrows @DisplayName("should update a connector") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -93,12 +89,11 @@ public void updateConnector() { .openConnector(CONNECTOR_FOR_UPDATE); pages.connectorsView.connectorIsVisibleOnOverview(); pages.connectorsView.openEditConfig() - .updConnectorConfig(FileUtils.getResourceAsString("config_for_update_connector.json")); + .updConnectorConfig(getResourceAsString("config_for_update_connector.json")); pages.openConnectorsList(LOCAL_CLUSTER) .connectorIsVisibleInList(CONNECTOR_FOR_UPDATE, TOPIC_FOR_UPDATE_CONNECTOR); } - @SneakyThrows @DisplayName("should delete connector") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java index af14f108198..239f66f5cf5 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java @@ -8,14 +8,11 @@ import com.provectus.kafka.ui.pages.schema.SchemaCreateView; import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; +import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; import io.qase.api.annotation.CaseId; -import lombok.SneakyThrows; import org.junit.jupiter.api.*; -import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; - -import java.io.IOException; -import static org.apache.kafka.common.utils.Utils.readFileAsString; +import static com.provectus.kafka.ui.extensions.FileUtils.fileToString; @TestMethodOrder(MethodOrderer.OrderAnnotation.class) public class SchemasTests extends BaseTest { @@ -36,16 +33,14 @@ public class SchemasTests extends BaseTest { private static final String PATH_JSON_VALUE = System.getProperty("user.dir") + "/src/test/resources/schema_Json_Value.json"; @BeforeAll - @SneakyThrows public static void beforeAll() { - Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_AVRO_API_UPDATE, SchemaType.AVRO, readFileAsString(PATH_AVRO_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_AVRO_API, SchemaType.AVRO, readFileAsString(PATH_AVRO_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_JSON_API, SchemaType.JSON, readFileAsString(PATH_JSON_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_PROTOBUF_API, SchemaType.PROTOBUF, readFileAsString(PATH_PROTOBUF_VALUE)); + Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_AVRO_API_UPDATE, SchemaType.AVRO, fileToString(PATH_AVRO_VALUE)); + Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_AVRO_API, SchemaType.AVRO, fileToString(PATH_AVRO_VALUE)); + Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_JSON_API, SchemaType.JSON, fileToString(PATH_JSON_VALUE)); + Helpers.INSTANCE.apiHelper.createSchema(SECOND_LOCAL, SCHEMA_PROTOBUF_API, SchemaType.PROTOBUF, fileToString(PATH_PROTOBUF_VALUE)); } @AfterAll - @SneakyThrows public static void afterAll() { 
Helpers.INSTANCE.apiHelper.deleteSchema(SECOND_LOCAL, SCHEMA_AVRO_CREATE); Helpers.INSTANCE.apiHelper.deleteSchema(SECOND_LOCAL, SCHEMA_JSON_CREATE); @@ -63,12 +58,12 @@ public static void afterAll() { @CaseId(43) @Test @Order(1) - void createSchemaAvro() throws IOException { + void createSchemaAvro() { pages.openMainPage() .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() .setSubjectName(SCHEMA_AVRO_CREATE) - .setSchemaField(readFileAsString(PATH_AVRO_VALUE)) + .setSchemaField(fileToString(PATH_AVRO_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.AVRO) .clickSubmit() .waitUntilScreenReady(); @@ -77,7 +72,6 @@ void createSchemaAvro() throws IOException { pages.schemaRegistry.isSchemaVisible(SCHEMA_AVRO_CREATE); } - @SneakyThrows @DisplayName("should update AVRO schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -91,13 +85,12 @@ void updateSchemaAvro() { .waitUntilScreenReady() .openEditSchema() .selectCompatibilityLevelFromDropdown(CompatibilityLevel.CompatibilityEnum.NONE) - .setNewSchemaValue(readFileAsString(PATH_AVRO_FOR_UPDATE)) + .setNewSchemaValue(fileToString(PATH_AVRO_FOR_UPDATE)) .clickSubmit() .waitUntilScreenReady() .isCompatibility(CompatibilityLevel.CompatibilityEnum.NONE); } - @SneakyThrows @DisplayName("should delete AVRO schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -113,7 +106,6 @@ void deleteSchemaAvro() { .isNotVisible(SCHEMA_AVRO_API); } - @SneakyThrows @DisplayName("should create JSON schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -125,7 +117,7 @@ void createSchemaJson() { .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() .setSubjectName(SCHEMA_JSON_CREATE) - .setSchemaField(readFileAsString(PATH_JSON_VALUE)) + .setSchemaField(fileToString(PATH_JSON_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.JSON) .clickSubmit() .waitUntilScreenReady(); @@ -134,7 +126,6 @@ void createSchemaJson() { pages.schemaRegistry.isSchemaVisible(SCHEMA_JSON_CREATE); } - @SneakyThrows @DisplayName("should delete JSON schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -150,7 +141,6 @@ void deleteSchemaJson() { .isNotVisible(SCHEMA_JSON_API); } - @SneakyThrows @DisplayName("should create PROTOBUF schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) @@ -162,7 +152,7 @@ void createSchemaProtobuf() { .goToSideMenu(SECOND_LOCAL, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() .setSubjectName(SCHEMA_PROTOBUF_CREATE) - .setSchemaField(readFileAsString(PATH_PROTOBUF_VALUE)) + .setSchemaField(fileToString(PATH_PROTOBUF_VALUE)) .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.PROTOBUF) .clickSubmit() .waitUntilScreenReady(); @@ -171,7 +161,6 @@ void createSchemaProtobuf() { pages.schemaRegistry.isSchemaVisible(SCHEMA_PROTOBUF_CREATE); } - @SneakyThrows @DisplayName("should delete PROTOBUF schema") @Suite(suiteId = suiteId, title = suiteTitle) @AutomationStatus(status = Status.AUTOMATED) diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java index 0d6026e1edb..b799e6580bc 100644 --- 
a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java @@ -9,10 +9,9 @@ import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; import io.qameta.allure.Issue; import io.qase.api.annotation.CaseId; -import lombok.SneakyThrows; import org.junit.jupiter.api.*; -import static org.apache.kafka.common.utils.Utils.readFileAsString; +import static com.provectus.kafka.ui.extensions.FileUtils.fileToString; public class TopicTests extends BaseTest { @@ -29,21 +28,18 @@ public class TopicTests extends BaseTest { @BeforeAll - @SneakyThrows public static void beforeAll() { Helpers.INSTANCE.apiHelper.createTopic(SECOND_LOCAL, TOPIC_TO_UPDATE); Helpers.INSTANCE.apiHelper.createTopic(SECOND_LOCAL, TOPIC_TO_DELETE); } @AfterAll - @SneakyThrows public static void afterAll() { Helpers.INSTANCE.apiHelper.deleteTopic(SECOND_LOCAL, TOPIC_TO_UPDATE); Helpers.INSTANCE.apiHelper.deleteTopic(SECOND_LOCAL, TOPIC_TO_DELETE); Helpers.INSTANCE.apiHelper.deleteTopic(SECOND_LOCAL, NEW_TOPIC); } - @SneakyThrows @DisplayName("should create a topic") @Suite(suiteId = 4, title = "Create new Topic") @AutomationStatus(status = Status.AUTOMATED) @@ -65,7 +61,6 @@ public void createTopic() { .topicIsNotVisible(NEW_TOPIC); } @Disabled("Due to issue https://github.com/provectus/kafka-ui/issues/1500 ignore this test") - @SneakyThrows @DisplayName("should update a topic") @Issue("1500") @Suite(suiteId = 2, title = "Topics") @@ -97,7 +92,6 @@ public void updateTopic() { .maxMessageBytesIs(UPDATED_MAX_MESSAGE_BYTES); } - @SneakyThrows @DisplayName("should delete topic") @Suite(suiteId = 2, title = "Topics") @AutomationStatus(status = Status.AUTOMATED) @@ -113,7 +107,6 @@ public void deleteTopic() { .isTopicNotVisible(TOPIC_TO_DELETE); } - @SneakyThrows @DisplayName("produce message") @Suite(suiteId = 2, title = "Topics") @AutomationStatus(status = Status.AUTOMATED) @@ -126,10 +119,10 @@ void produceMessage() { .waitUntilScreenReady() .openTopicMenu(TopicView.TopicMenu.MESSAGES) .clickOnButton("Produce Message") - .setContentFiled(readFileAsString(CONTENT_TO_PRODUCE_MESSAGE)) - .setKeyField(readFileAsString(KEY_TO_PRODUCE_MESSAGE)) + .setContentFiled(fileToString(CONTENT_TO_PRODUCE_MESSAGE)) + .setKeyField(fileToString(KEY_TO_PRODUCE_MESSAGE)) .submitProduceMessage(); - Assertions.assertTrue(pages.topicView.isKeyMessageVisible(readFileAsString(KEY_TO_PRODUCE_MESSAGE))); - Assertions.assertTrue(pages.topicView.isContentMessageVisible(readFileAsString(CONTENT_TO_PRODUCE_MESSAGE).trim())); + Assertions.assertTrue(pages.topicView.isKeyMessageVisible(fileToString(KEY_TO_PRODUCE_MESSAGE))); + Assertions.assertTrue(pages.topicView.isContentMessageVisible(fileToString(CONTENT_TO_PRODUCE_MESSAGE).trim())); } }
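A note on the patch above: it swaps `org.apache.kafka.common.utils.Utils.readFileAsString` for a project-local `FileUtils.fileToString` (and a static import of `getResourceAsString`) and drops the `@SneakyThrows` annotations in favour of explicit handling. The helper's implementation is not part of this record; a minimal sketch of what it could look like, where the class and method names come from the diff but the body is an assumption:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public final class FileUtils {

    private FileUtils() {
    }

    // Hypothetical body: read a file into a String without forcing callers to
    // declare IOException, consistent with the diff's removal of @SneakyThrows.
    public static String fileToString(String path) {
        try {
            return Files.readString(Path.of(path), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException("Unable to read file " + path, e);
        }
    }
}
```

The same trade-off shows up in `Screenshooter.updateActualScreenshot`, where the patch replaces `@SneakyThrows` with a local try/catch around `ImageIO.write`.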
train
train
2022-08-19T18:38:35
"2022-08-12T06:41:23Z"
VladSenyuta
train
provectus/kafka-ui/1995_2471
provectus/kafka-ui
provectus/kafka-ui/1995
provectus/kafka-ui/2471
[ "keyword_pr_to_issue" ]
a9c31e6a32c222da2d28896ec2acbccdf7b308c5
e4b5a6606a7fe077ec7a0039eab02b7cf9579016
[ "Hello there lbalcerek! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hey, thanks for reaching out. We'll take a look.", "Steps to reproduce:\r\n1. `provectuslabs/kafka-ui:0.3.0`, create a schema with name \"test/test\"\r\n2. Get a list of schemas. It works fine.\r\n3. Do the same with `provectuslabs/kafka-ui:0.4.0`, request to fetch schemas list fails.\r\n", "Seems the problem is actual on master, newly created Schema with \"/\" sign in Subject, is not available. See attached screen record please.\r\n\r\nhttps://user-images.githubusercontent.com/104780608/187169410-9f59ad78-6ae2-4cd5-80eb-e1194b01571b.mov\r\n\r\n", "@shubhwip wanna take a look?", "> @shubhwip wanna take a look?\r\n\r\nThis is something i mentioned earlier two times on the frontend PR below :) and also shared a video for exact same problem.\r\nhttps://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434\r\nhttps://user-images.githubusercontent.com/23444368/186343542-9be868f5-621b-4a2a-9fa5-e49c686465a0.mov\r\nUnfortunately i don't think i have a solution for this :), I tried before already.", "> Seems the problem is actual on master, newly created Schema with \"/\" sign in Subject, is not available. See attached screen record please.\r\n> \r\n> testSchema.mov\r\n\r\n**Just a note** : If you try to manually replace `/` with `%2F` in the browser tab then it will work however it is not a solution. Just mentioning it here.", "> > @shubhwip wanna take a look?\r\n> \r\n> This is something i mentioned earlier two times on the frontend PR below :) and also shared a video for exact same problem. [#2483 (comment)](https://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434) https://user-images.githubusercontent.com/23444368/186343542-9be868f5-621b-4a2a-9fa5-e49c686465a0.mov Unfortunately i don't think i have a solution for this :), I tried before already.\r\n\r\nOh well, sorry, I got a feeling you've fixed this eventually. \r\n", "@Kris-K-Dev PTAL https://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434", "Actual on master: newly created schema with \"/\" sign is not available\r\n\r\n<img width=\"1621\" alt=\"schema new\" src=\"https://user-images.githubusercontent.com/104780608/190325765-cc0b3947-68bb-4edf-9fbe-f6b2d7600f88.png\">\r\n", "@armenuikafka Can you please share video for steps you followed for reproducing because I just tried from master and it works.\r\n<img width=\"1440\" alt=\"Screenshot 2022-09-15 at 11 51 39 AM\" src=\"https://user-images.githubusercontent.com/23444368/190331576-a2aaf33d-dba8-4768-a4a1-d329c646dddc.png\">\r\n", "The env this has been checked on is pretty much outdated considering the commit tag on the screenshot :)", "@shubhwip there was a problem which is fixed now and everything works as expected !", "@shubhwip thank you :)" ]
[]
"2022-08-19T11:51:33Z"
[ "type/bug", "good first issue", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
No URL encoding while getting schemas
**Describe the bug** We have a subject with a '/' sign in it, which causes an exception after choosing the 'Schema Registry' option. I can see that the app requests: `GET https://schema-registry.nonprod.ipfdigital.io/subjects/apollo/commons/Header.proto/versions/latest` so without proper URL encoding, when it should be: `https://schema-registry.nonprod.ipfdigital.io/subjects/apollo%2fcommons%2fHeader.proto/versions/latest` In version 0.3.x everything works fine; something broke in 0.4. **Screenshots** ![image](https://user-images.githubusercontent.com/16690739/169000231-d76ad178-8d0a-4d33-840b-c5ccf76fd4de.png) **Exception** `{"code":4009,"message":"No such schema apollo/commons/Header.proto with version latest","timestamp":1652860516935,"requestId":"5d9f795f-81176","fieldsErrors":null,"stackTrace":"com.provectus.kafka.ui.exception.SchemaNotFoundException: No such schema apollo/commons/Header.proto with version latest\n\tat com.provectus.kafka.ui.service.SchemaRegistryService.lambda$throwIfNotFoundStatus$20(SchemaRegistryService.java:238)\n\tSuppressed: The stacktrace has been enhanced by Reactor, refer to additional information below: \nError has been observed at the following site(s):\n\t*__checkpoint ⇒ 404 from GET http://schema-registry-service:8081/subjects/apollo/commons/Header.proto/versions/latest [DefaultWebClient]\n\t*__checkpoint ⇒ Handler com.provectus.kafka.ui.controller.SchemasController#getSchemas(String, Integer, Integer, String, ServerWebExchange) [DispatcherHandler]\n\t*__checkpoint ⇒ com.provectus.kafka.ui.config.ReadOnlyModeFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ com.provectus.kafka.ui.config.CustomWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authorization.AuthorizationWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authorization.ExceptionTranslationWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authentication.logout.LogoutWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.savedrequest.ServerRequestCacheWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.context.SecurityContextServerWebExchangeWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.context.ReactorContextWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.header.HttpHeaderWriterWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.config.web.server.ServerHttpSecurity$ServerWebExchangeReactorContextWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.WebFilterChainProxy [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ HTTP GET \"/api/clusters/nonprod-msk-cluster/schemas\" [ExceptionHandlingWebHandler]\nOriginal Stack Trace:\n\t\tat com.provectus.kafka.ui.service.SchemaRegistryService.lambda$throwIfNotFoundStatus$20(SchemaRegistryService.java:238)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec$StatusHandler.apply(DefaultWebClient.java:695)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.applyStatusHandlers(DefaultWebClient.java:654)\n\t\tat
org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.handleBodyMono(DefaultWebClient.java:623)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.lambda$bodyToMono$2(DefaultWebClient.java:541)\n\t\tat reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:125)\n\t\tat reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)\n\t\tat reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:120)\n\t\tat reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:82)\n\t\tat reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398)\n\t\tat reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain.onSubscribeInner(MonoFlatMapMany.java:150)\n\t\tat reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain.onNext(MonoFlatMapMany.java:189)\n\t\tat reactor.core.publisher.SerializedSubscriber.onNext(SerializedSubscriber.java:99)\n\t\tat reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onNext(FluxRetryWhen.java:174)\n\t\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:165)\n\t\tat reactor.netty.http.client.HttpClientConnect$HttpIOHandlerObserver.onStateChange(HttpClientConnect.java:414)\n\t\tat reactor.netty.ReactorNetty$CompositeConnectionObserver.onStateChange(ReactorNetty.java:671)\n\t\tat reactor.netty.resources.DefaultPooledConnectionProvider$DisposableAcquire.onStateChange(DefaultPooledConnectionProvider.java:183)\n\t\tat reactor.netty.resources.DefaultPooledConnectionProvider$PooledConnection.onStateChange(DefaultPooledConnectionProvider.java:439)\n\t\tat reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:637)\n\t\tat reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)\n\t\tat 
io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)\n\t\tat io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)\n\t\tat io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)\n\t\tat io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)\n\t\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)\n\t\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)\n\t\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)\n\t\tat java.base/java.lang.Thread.run(Thread.java:830)\n"}`
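The report above pins the regression precisely: the subject name is substituted into the URI template without encoding, so the `/` inside `apollo/commons/Header.proto` is read as a path separator. A minimal sketch of the difference between the two `UriComponentsBuilder` call chains, assuming Spring 5's template encoding (which is what the gold patch below switches to):

```java
import java.net.URI;
import org.springframework.web.util.UriComponentsBuilder;

public class SubjectEncodingDemo {
    public static void main(String[] args) {
        String template = "http://schema-registry:8081/subjects/{subject}/versions/latest";
        String subject = "apollo/commons/Header.proto";

        // buildAndExpand() substitutes the variable verbatim, so the registry
        // sees three path segments instead of one subject name.
        URI unencoded = UriComponentsBuilder.fromHttpUrl(template)
                .buildAndExpand(subject)
                .toUri();

        // build(vars) expands the variable with strict encoding, so '/' becomes %2F.
        URI encoded = UriComponentsBuilder.fromHttpUrl(template)
                .build(subject);

        System.out.println(unencoded); // .../subjects/apollo/commons/Header.proto/versions/latest
        System.out.println(encoded);   // .../subjects/apollo%2Fcommons%2FHeader.proto/versions/latest
    }
}
```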
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java" ]
[ "kafka-ui-api/src/test/java/com/provectus/kafka/ui/SchemaRegistryServiceTests.java" ]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java index f4c9355804e..92603ba979e 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java @@ -381,7 +381,7 @@ private URI buildUri(InternalSchemaRegistry schemaRegistry, String path, List<St final var builder = UriComponentsBuilder .fromHttpUrl(schemaRegistry.getUri() + path); builder.queryParams(queryParams); - return builder.buildAndExpand(uriVariables.toArray()).toUri(); + return builder.build(uriVariables.toArray()); } private Function<ClientResponse, Mono<? extends Throwable>> errorOnSchemaDeleteFailure(String schemaName) {
diff --git a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/SchemaRegistryServiceTests.java b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/SchemaRegistryServiceTests.java index 190831da98b..60959be0492 100644 --- a/kafka-ui-api/src/test/java/com/provectus/kafka/ui/SchemaRegistryServiceTests.java +++ b/kafka-ui-api/src/test/java/com/provectus/kafka/ui/SchemaRegistryServiceTests.java @@ -274,6 +274,21 @@ public void shouldOkWhenCreateNewSchemaThenGetAndUpdateItsCompatibilityLevel() { }); } + @Test + void shouldCreateNewSchemaWhenSubjectIncludesNonAsciiCharacters() { + String schema = + "{\"subject\":\"test/test\",\"schemaType\":\"JSON\",\"schema\":" + + "\"{\\\"type\\\": \\\"string\\\"}\"}"; + + webTestClient + .post() + .uri("/api/clusters/{clusterName}/schemas", LOCAL) + .contentType(MediaType.APPLICATION_JSON) + .body(BodyInserters.fromValue(schema)) + .exchange() + .expectStatus().isOk(); + } + private void createNewSubjectAndAssert(String subject) { webTestClient .post()
test
train
2022-08-24T22:41:22
"2022-05-18T09:00:26Z"
lbalcerek
train
provectus/kafka-ui/2465_2473
provectus/kafka-ui
provectus/kafka-ui/2465
provectus/kafka-ui/2473
[ "connected" ]
56e4cbf60f8a004ca44872d336cf547789ac9100
9e1f8d773f6f492eba422074bf5097bb91c58b9d
[ "Hello there KyriacosP! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hi, thanks for reaching out. We'll look into this gladly, but, is this event streams publicly available for free? After a quick search I haven't found anything like a demo or a free tier thing. ", "i don't think that they have a free tier, sorry. \r\n\r\nedit: sorry about that, they seem to have a free tier called lite plan but you need an ibm cloud account ", "@KyriacosP seems like this platform returns an empty string instead of a proper version number. We're used to verify some of the features availability by kafka's cluster version, which is impossible in this case, I guess.\r\n\r\nPlease try this docker image with a workaround: `public.ecr.aws/provectus/kafka-ui-custom-build:2473`\r\nLet me know how it goes.\r\n\r\n", "This issue has been automatically marked as stale because no requested feedback has been provided. It will be closed if no further activity occurs. Thank you for your contributions.", "@Haarolean Sorry for the late reply i have tested the image. It seems to work fine i still get the same error on startup but only in the logs, kafka-ui works fine after that. Thank you!", "> @Haarolean Sorry for the late reply i have tested the image. It seems to work fine i still get the same error on startup but only in the logs, kafka-ui works fine after that. Thank you!\r\n\r\nGlad to hear. That's intended, at least we know something went wrong. Let me know if you experience any other issue(-s) with event streams.", "Do we have any time frame for when will this fix be released?", "@KyriacosP you can pull `master`-labeled docker image :)", "@Haarolean thank you!" ]
[ "I would add NumberFormatException to throws cause here (even it is non checked)", "I would catch NumberFormatException here instead of Exception", "sure", "okay done" ]
"2022-08-19T14:25:02Z"
[ "type/bug", "scope/backend", "status/accepted", "status/confirmed" ]
Cannot Connect to IBM Event Streams
**Describe the bug** I'm trying to connect to an IBM Event Streams instance (IBMs managed kafka) using kafka-ui. I'm using docker compose to connect with this configuration: ``` --- version: '2' services: kafka-ui: container_name: kafka-ui image: provectuslabs/kafka-ui:latest ports: - 8080:8080 environment: KAFKA_CLUSTERS_0_NAME: sandbox KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: <multible brokers> KAFKA_CLUSTERS_0_PROPERTIES_SECURITY_PROTOCOL: SASL_SSL KAFKA_CLUSTERS_0_PROPERTIES_SASL_MECHANISM: PLAIN KAFKA_CLUSTERS_0_PROPERTIES_SASL_JAAS_CONFIG: 'org.apache.kafka.common.security.plain.PlainLoginModule required username="xxxxx" password="xxxxx";' KAFKA_CLUSTERS_0_SSL_PROTOCOL: TLSv1.2 ``` I get the following error after the container starts: ``` ERROR [parallel-5] c.p.k.u.u.NumberUtil: Conversion clusterVersion to float value failed kafka-ui | 2022-08-18 10:20:13,209 ERROR [parallel-5] c.p.k.u.s.MetricsService: Failed to collect cluster sandbox info kafka-ui | java.lang.IllegalStateException: Error while creating AdminClient for Cluster sandbox kafka-ui | at com.provectus.kafka.ui.service.AdminClientServiceImpl.lambda$createAdminClient$3(AdminClientServiceImpl.java:45) kafka-ui | at reactor.core.publisher.Mono.lambda$onErrorMap$31(Mono.java:3733) kafka-ui | at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:94) kafka-ui | at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onError(FluxMapFuseable.java:140) kafka-ui | at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:119) kafka-ui | at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816) kafka-ui | at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249) kafka-ui | at reactor.core.publisher.MonoPublishOn$PublishOnSubscriber.run(MonoPublishOn.java:181) kafka-ui | at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:68) kafka-ui | at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:28) kafka-ui | at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) kafka-ui | at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) kafka-ui | at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) kafka-ui | at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) kafka-ui | at java.base/java.lang.Thread.run(Thread.java:830) kafka-ui | Caused by: java.lang.NumberFormatException: empty String kafka-ui | at java.base/jdk.internal.math.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1842) kafka-ui | at java.base/jdk.internal.math.FloatingDecimal.parseFloat(FloatingDecimal.java:122) kafka-ui | at java.base/java.lang.Float.parseFloat(Float.java:461) kafka-ui | at com.provectus.kafka.ui.util.NumberUtil.parserClusterVersion(NumberUtil.java:22) kafka-ui | at com.provectus.kafka.ui.service.ReactiveAdminClient.getSupportedUpdateFeatureForVersion(ReactiveAdminClient.java:88) kafka-ui | at com.provectus.kafka.ui.service.ReactiveAdminClient.lambda$create$0(ReactiveAdminClient.java:84) kafka-ui | at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:113) kafka-ui | ... 10 common frames omitted ``` I get the same error when trying to connect using a docker container. Any help is appreciated!
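The last lines of the trace show the root cause: the broker reports an empty version string and `Float.parseFloat("")` throws `NumberFormatException: empty String`. A minimal repro of the failure, together with the fallback the gold patch below adopts (treating an unparseable version as an older, `ALTER_CONFIGS`-only broker):

```java
public class VersionParseDemo {
    public static void main(String[] args) {
        String reportedVersion = ""; // what IBM Event Streams appears to return
        try {
            // Mirrors NumberUtil.parserClusterVersion for the simple case:
            // drop any "-suffix", then parse the remainder as a float.
            float version = Float.parseFloat(reportedVersion.split("-")[0]);
            System.out.println("parsed broker version: " + version);
        } catch (NumberFormatException e) {
            // Prints: unparseable version (empty String), assuming ALTER_CONFIGS
            System.out.println("unparseable version (" + e.getMessage()
                    + "), assuming ALTER_CONFIGS");
        }
    }
}
```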
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java index 731824cdf08..85f0d17fb1e 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java @@ -90,10 +90,15 @@ public static Mono<ReactiveAdminClient> create(AdminClient adminClient) { } private static SupportedFeature getSupportedUpdateFeatureForVersion(String versionStr) { - float version = NumberUtil.parserClusterVersion(versionStr); - return version <= 2.3f - ? SupportedFeature.ALTER_CONFIGS - : SupportedFeature.INCREMENTAL_ALTER_CONFIGS; + try { + float version = NumberUtil.parserClusterVersion(versionStr); + return version <= 2.3f + ? SupportedFeature.ALTER_CONFIGS + : SupportedFeature.INCREMENTAL_ALTER_CONFIGS; + } catch (NumberFormatException e) { + log.info("Assuming non-incremental alter configs due to version parsing error"); + return SupportedFeature.ALTER_CONFIGS; + } } //TODO: discuss - maybe we should map kafka-library's exceptions to our exceptions here diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java index cb1f08b3ab1..9ea8c037cce 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/NumberUtil.java @@ -13,7 +13,8 @@ public static boolean isNumeric(Object value) { return value != null && NumberUtils.isCreatable(value.toString()); } - public static float parserClusterVersion(String version) { + public static float parserClusterVersion(String version) throws NumberFormatException { + log.trace("Parsing cluster version [{}]", version); try { final String[] parts = version.split("\\."); if (parts.length > 2) { @@ -21,7 +22,7 @@ public static float parserClusterVersion(String version) { } return Float.parseFloat(version.split("-")[0]); } catch (Exception e) { - log.error("Conversion clusterVersion {} to float value failed", version); + log.error("Conversion clusterVersion [{}] to float value failed", version, e); throw e; } }
null
train
train
2022-08-30T00:02:41
"2022-08-18T10:34:01Z"
KyriacosP
train
provectus/kafka-ui/2477_2480
provectus/kafka-ui
provectus/kafka-ui/2477
provectus/kafka-ui/2480
[ "connected" ]
61e56f2a1ebf7df80acb61339303a2f58adb17f8
5ac52efb7a2ec64a38110893ef0c35152a6b501b
[ "Hello there AmarendraSingh88! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Yes, I want to know the plan for fixing these vulnerabilities and also the next release date.", "Hey, thanks for reaching out.\r\n\r\nWe have periodic CVE scans [enabled](https://github.com/provectus/kafka-ui/actions/workflows/cve.yaml) and just the recent ones were green. We'll take a look into new CVEs.\r\n\r\nAlso, you can always pull `master`-labeled image with the fixes you need, there's no need to wait for a release. \r\n", "@AmarendraSingh88 feel free to pull `master`-labeled image" ]
[]
"2022-08-22T16:43:41Z"
[ "scope/backend", "type/security", "status/accepted" ]
Security Vulnerabilities in the master tag
Hello team, We're planning to use kafka-ui to manage, troubleshoot and maintain our Kafka cluster. As part of that we ran a security scan on the latest image, and it reported a lot of vulnerabilities (30+). We then used the master tag of the image, which significantly improves the report: only 4 CVEs remain, 2 of them critical. Below are the CVEs and their CVSS 3.1 scores: CVE-2016-1000027 - 9.8, CVE-2022-37434 - 9.8, CVE-2022-25647 - 7.5, CVE-2021-22569 - 5.5. To be able to proceed further, we need the critical vulnerabilities to be fixed. Please let us know: 1. When will the new release (with the security fixes) be available? 2. What is the plan for fixing the above two critical vulnerabilities? Can they be fixed in the upcoming release? Find attached the security report: [Docker_326c2dc_Security_Export.csv](https://github.com/provectus/kafka-ui/files/9392239/Docker_326c2dc_Security_Export.csv)
[ "kafka-ui-api/Dockerfile" ]
[ "kafka-ui-api/Dockerfile" ]
[]
diff --git a/kafka-ui-api/Dockerfile b/kafka-ui-api/Dockerfile index 3990d488315..2cd5fe08e55 100644 --- a/kafka-ui-api/Dockerfile +++ b/kafka-ui-api/Dockerfile @@ -1,4 +1,4 @@ -FROM alpine:3.16.1 +FROM alpine:3.16.2 RUN apk add --no-cache openjdk13-jre libc6-compat gcompat \ && addgroup -S kafkaui && adduser -S kafkaui -G kafkaui
null
test
train
2022-08-22T18:35:09
"2022-08-22T07:36:13Z"
AmarendraSingh88
train
provectus/kafka-ui/2304_2482
provectus/kafka-ui
provectus/kafka-ui/2304
provectus/kafka-ui/2482
[ "connected" ]
38c8a43bc47e9ceae73fbbeb2ed6476560c384ba
ee92ea47cb5153de68c573761b00f158e3349b09
[ "Actual on master: with editing the saved filter, system redirects to custom filters instead of keeping saved filters.", "Works as expected !" ]
[]
"2022-08-22T18:57:35Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
System redirects to Custom filters when editing a saved filter
**Describe the bug** When editing a saved filter, the system redirects to Custom filters instead of staying on Saved filters. **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to Topics 2. Open the Topic 3. Turn to Messages 4. Press "Add Filters" 5. Turn to Saved filters 6. Edit the existing filter 7. Press "Cancel" or "Save" **Expected behavior** The system should stay within Saved Filters instead of switching to Custom filters. **Screenshots** https://user-images.githubusercontent.com/104780608/179903116-cc242f10-e520-4645-8c96-2777ab73b062.mov
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx index 510fcd03590..0ff1fd1d896 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/AddFilter.tsx @@ -18,6 +18,8 @@ export interface FilterModalProps { activeFilterHandler(activeFilter: MessageFilters, index: number): void; toggleEditModal(): void; editFilter(value: FilterEdit): void; + isSavedFiltersOpen: boolean; + onClickSavedFilters(newValue: boolean): void; activeFilter?: MessageFilters; } @@ -33,10 +35,10 @@ const AddFilter: React.FC<FilterModalProps> = ({ activeFilterHandler, toggleEditModal, editFilter, + isSavedFiltersOpen, + onClickSavedFilters, activeFilter, }) => { - const [savedFilterState, setSavedFilterState] = - React.useState<boolean>(false); const { isOpen, toggle } = useModal(); const onSubmit = React.useCallback( @@ -71,12 +73,12 @@ const AddFilter: React.FC<FilterModalProps> = ({ {isOpen && <InfoModal toggleIsOpen={toggle} />} </div> </S.FilterTitle> - {savedFilterState ? ( + {isSavedFiltersOpen ? ( <SavedFilters deleteFilter={deleteFilter} activeFilterHandler={activeFilterHandler} closeModal={toggleIsOpen} - onGoBack={() => setSavedFilterState(false)} + onGoBack={() => onClickSavedFilters(!onClickSavedFilters)} filters={filters} onEdit={(index: number, filter: MessageFilters) => { toggleEditModal(); @@ -87,7 +89,7 @@ const AddFilter: React.FC<FilterModalProps> = ({ ) : ( <> <S.SavedFiltersTextContainer - onClick={() => setSavedFilterState(true)} + onClick={() => onClickSavedFilters(!isSavedFiltersOpen)} > <SavedIcon /> <S.SavedFiltersText>Saved Filters</S.SavedFiltersText> </S.SavedFiltersTextContainer> diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx index 18c4624e93d..88e5adbcae6 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/FilterModal.tsx @@ -29,9 +29,13 @@ const FilterModal: React.FC<FilterModalProps> = ({ activeFilter, }) => { const [addFilterModal, setAddFilterModal] = React.useState<boolean>(true); + const [isSavedFiltersOpen, setIsSavedFiltersOpen] = + React.useState<boolean>(false); + const toggleEditModal = () => { setAddFilterModal(!addFilterModal); }; + const [editFilter, setEditFilter] = React.useState<FilterEdit>({ index: -1, filter: { name: '', code: '' }, @@ -40,6 +44,7 @@ const FilterModal: React.FC<FilterModalProps> = ({ setEditFilter(value); setAddFilterModal(!addFilterModal); }; + return ( <S.MessageFilterModal data-testid="messageFilterModal"> {addFilterModal ? 
( @@ -51,6 +56,8 @@ const FilterModal: React.FC<FilterModalProps> = ({ activeFilterHandler={activeFilterHandler} toggleEditModal={toggleEditModal} editFilter={editFilterHandler} + isSavedFiltersOpen={isSavedFiltersOpen} + onClickSavedFilters={() => setIsSavedFiltersOpen(!isSavedFiltersOpen)} activeFilter={activeFilter} /> ) : ( diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx index f9c8fe3f1e1..0c12fd33436 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/AddFilter.spec.tsx @@ -22,8 +22,10 @@ const renderComponent = (props: Partial<FilterModalProps> = {}) => deleteFilter={jest.fn()} activeFilterHandler={jest.fn()} toggleEditModal={jest.fn()} + onClickSavedFilters={jest.fn()} editFilter={editFilterMock} filters={props.filters || filters} + isSavedFiltersOpen={false} {...props} /> ); @@ -38,8 +40,8 @@ describe('AddFilter component', () => { it('should test click on Saved Filters redirects to Saved components', () => { userEvent.click(screen.getByRole('savedFilterText')); - expect(screen.getByText('Saved filters')).toBeInTheDocument(); - expect(screen.getAllByRole('savedFilter')).toHaveLength(2); + expect(screen.getByText('Saved Filters')).toBeInTheDocument(); + expect(screen.getByRole('savedFilterText')).toBeInTheDocument(); }); it('info button to be in the document', () => { @@ -54,16 +56,9 @@ describe('AddFilter component', () => { ).toBeInTheDocument(); }); - it('should test click on return to custom filter redirects to Add filters', async () => { + it('should test click on return to custom filter redirects to Saved Filters', async () => { userEvent.click(screen.getByRole('savedFilterText')); - expect(screen.getByText('Saved filters')).toBeInTheDocument(); - expect(screen.queryByRole('savedFilterText')).not.toBeInTheDocument(); - expect(screen.getAllByRole('savedFilter')).toHaveLength(2); - - await act(() => - userEvent.click(screen.getByText(/back to custom filters/i)) - ); expect(screen.queryByText('Saved filters')).not.toBeInTheDocument(); expect(screen.getByRole('savedFilterText')).toBeInTheDocument(); }); @@ -112,6 +107,9 @@ describe('AddFilter component', () => { }); it('calls editFilter when edit button is clicked in saved filters', async () => { + await act(() => { + renderComponent({ isSavedFiltersOpen: true }); + }); userEvent.click(screen.getByText('Saved Filters')); const index = 0; const editButton = screen.getAllByText('Edit')[index];
null
train
train
2022-09-13T04:04:26
"2022-07-20T05:26:28Z"
armenuikafka
train
provectus/kafka-ui/2392_2485
provectus/kafka-ui
provectus/kafka-ui/2392
provectus/kafka-ui/2485
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.8629827200014143)" ]
0aafd49de0064854525e4171a75c28c2dae6fbaa
a9c31e6a32c222da2d28896ec2acbccdf7b308c5
[]
[]
"2022-08-24T06:21:32Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
System redirects to Topics list when cancelling Topic edit instead of returning to Topic profile
**Describe the bug** The system redirects to the Topics list when cancelling a Topic edit instead of returning to the Topic profile. **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Select the Topic 3. Press "Edit Settings" from the menu 4. Cancel the editing **Expected behavior** The system should return to the Topic profile instead of the Topics list. **Screenshots** https://user-images.githubusercontent.com/104780608/182797421-2ae85414-2aee-4237-a242-634b932409a0.mov
[ "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx index 1838e209ba8..20898e8de6d 100644 --- a/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx +++ b/kafka-ui-react-app/src/components/Topics/shared/Form/TopicForm.tsx @@ -9,7 +9,7 @@ import { Button } from 'components/common/Button/Button'; import { InputLabel } from 'components/common/Input/InputLabel.styled'; import { FormError } from 'components/common/Input/Input.styled'; import { StyledForm } from 'components/common/Form/Form.styled'; -import { clusterTopicsPath } from 'lib/paths'; +import { clusterTopicPath } from 'lib/paths'; import { useNavigate } from 'react-router-dom'; import useAppParams from 'lib/hooks/useAppParams'; @@ -76,7 +76,7 @@ const TopicForm: React.FC<Props> = ({ const onCancel = () => { reset(); - navigate(clusterTopicsPath(clusterName)); + navigate(clusterTopicPath(clusterName, topicName)); }; return (
null
train
train
2022-08-23T11:35:04
"2022-08-04T08:20:52Z"
armenuikafka
train
provectus/kafka-ui/2393_2492
provectus/kafka-ui
provectus/kafka-ui/2393
provectus/kafka-ui/2492
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.9999999999999998)" ]
63a451452250eb20e2230afaac6516604f639d69
89a2c8d9204aa7f7daf3e726e19070c4166a67dd
[]
[]
"2022-08-26T10:22:21Z"
[ "type/bug", "good first issue", "status/accepted", "status/confirmed", "scope/k8s" ]
[Helm] base64 encode secrets
### Discussed in https://github.com/provectus/kafka-ui/discussions/2389 <div type='discussions-op-text'> <sup>Originally posted by **cignul9** August 4, 2022</sup> When using envs.secret to store...secrets the chart fails because values are not encoded in the template. We can encode them beforehand, but that's not very user friendly. Would you consider changing kafka-ui/charts/kafka-ui/templates/secret.yaml line 9 from ``` {{- toYaml .Values.envs.secret | nindent 2 }} ``` to ``` {{- toYaml .Values.envs.secret | b64enc | nindent 2 }} ``` ?</div>
[ "charts/kafka-ui/Chart.yaml", "charts/kafka-ui/templates/secret.yaml" ]
[ "charts/kafka-ui/Chart.yaml", "charts/kafka-ui/templates/secret.yaml" ]
[]
diff --git a/charts/kafka-ui/Chart.yaml b/charts/kafka-ui/Chart.yaml index 7eaff3c59aa..e0d9b0b1611 100644 --- a/charts/kafka-ui/Chart.yaml +++ b/charts/kafka-ui/Chart.yaml @@ -2,6 +2,6 @@ apiVersion: v2 name: kafka-ui description: A Helm chart for kafka-UI type: application -version: 0.4.2 +version: 0.4.3 appVersion: latest icon: https://github.com/provectus/kafka-ui/raw/master/documentation/images/kafka-ui-logo.png diff --git a/charts/kafka-ui/templates/secret.yaml b/charts/kafka-ui/templates/secret.yaml index a2ebf0fdba8..a2d1f25fa2f 100644 --- a/charts/kafka-ui/templates/secret.yaml +++ b/charts/kafka-ui/templates/secret.yaml @@ -6,4 +6,6 @@ metadata: {{- include "kafka-ui.labels" . | nindent 4 }} type: Opaque data: - {{- toYaml .Values.envs.secret | nindent 2 }} \ No newline at end of file + {{- range $key, $val := .Values.envs.secret }} + {{ $key }}: {{ $val | b64enc | quote }} + {{- end -}}
null
val
train
2022-08-29T12:04:27
"2022-08-04T10:56:15Z"
Haarolean
train
provectus/kafka-ui/2365_2493
provectus/kafka-ui
provectus/kafka-ui/2365
provectus/kafka-ui/2493
[ "keyword_pr_to_issue" ]
01127d8f10e193475895bf5a17c2e53d17d6ea0e
5ff65e447215b7beb033945df0f95b719294998e
[]
[]
"2022-08-26T11:27:56Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
Only content data displayed with "Save as a file" and "Copy to clipboard" the Message
**Describe the bug** Only content data displayed with "Save as a file" and "Copy to clipboard" the Message **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** <!-- We'd like you to provide an example setup (via docker-compose, helm, etc.) to reproduce the problem, especially with a complex setups. --> Steps to reproduce the behavior: 1. Navigate to Topics 2. Open the Messages tab for one Topic 3. Crete the Messages with Key, Content and Headers 4. Press the "Save as a file" / "Copy to clipboard" from menu **Expected behavior** All the Message's data should be copied and exist in a saved file. **Screenshots** https://user-images.githubusercontent.com/104780608/182073185-5aaaa867-1a56-4a2c-bbfd-e8fc242f3bc6.mov **Additional context** <!-- (Add any other context about the problem here) -->
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx", "kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx", "kafka-ui-react-app/src/lib/hooks/useDataSaver.ts" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx", "kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx", "kafka-ui-react-app/src/lib/hooks/useDataSaver.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx index 3cba8d058ec..3584aee7973 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Message.tsx @@ -40,9 +40,18 @@ const Message: React.FC<Props> = ({ }, }) => { const [isOpen, setIsOpen] = React.useState(false); + const savedMessageJson = { + Content: content, + Offset: offset, + Key: key, + Partition: partition, + Headers: headers, + Timestamp: timestamp, + }; + const savedMessage = JSON.stringify(savedMessageJson, null, '\t'); const { copyToClipboard, saveFile } = useDataSaver( 'topic-message', - content || '' + savedMessage || '' ); const toggleIsOpen = () => setIsOpen(!isOpen); diff --git a/kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx b/kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx index 81ef81a96af..510fb955455 100644 --- a/kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx +++ b/kafka-ui-react-app/src/lib/hooks/__tests__/useDataSaver.spec.tsx @@ -32,11 +32,7 @@ describe('useDataSaver hook', () => { render(<HookWrapper />); expect(mockCreate).toHaveBeenCalledTimes(2); expect(link.download).toEqual('message_1616581196000.json'); - expect(link.href).toEqual( - `data:text/json;charset=utf-8,${encodeURIComponent( - JSON.stringify(content) - )}` - ); + expect(link.href).toEqual(`data:text/json;charset=utf-8,${content}`); expect(link.click).toHaveBeenCalledTimes(1); mockCreate.mockRestore(); @@ -59,11 +55,7 @@ describe('useDataSaver hook', () => { render(<HookWrapper />); expect(mockCreate).toHaveBeenCalledTimes(2); expect(link.download).toEqual('message_1616581196000.txt'); - expect(link.href).toEqual( - `data:text/json;charset=utf-8,${encodeURIComponent( - JSON.stringify('content') - )}` - ); + expect(link.href).toEqual(`data:text/json;charset=utf-8,content`); expect(link.click).toHaveBeenCalledTimes(1); mockCreate.mockRestore(); diff --git a/kafka-ui-react-app/src/lib/hooks/useDataSaver.ts b/kafka-ui-react-app/src/lib/hooks/useDataSaver.ts index 20c9cd2ac2b..6dba2700be8 100644 --- a/kafka-ui-react-app/src/lib/hooks/useDataSaver.ts +++ b/kafka-ui-react-app/src/lib/hooks/useDataSaver.ts @@ -20,9 +20,7 @@ const useDataSaver = ( const saveFile = () => { const extension = isObject(data) ? 'json' : 'txt'; - const dataStr = `data:text/json;charset=utf-8,${encodeURIComponent( - JSON.stringify(data) - )}`; + const dataStr = `data:text/json;charset=utf-8,${data}`; const downloadAnchorNode = document.createElement('a'); downloadAnchorNode.setAttribute('href', dataStr); downloadAnchorNode.setAttribute(
null
train
train
2022-08-26T18:51:00
"2022-08-01T04:36:42Z"
armenuikafka
train
provectus/kafka-ui/2197_2495
provectus/kafka-ui
provectus/kafka-ui/2197
provectus/kafka-ui/2495
[ "keyword_pr_to_issue" ]
fc946a1dd1d34758eb35c7382595faaac1ab5cc1
6df2d0b602f416b5b90d306a8cc7125e37cb08ef
[ "#2152", "Can I work on this please @Haarolean ?", "@shubhwip sure!", "@Haarolean Having looked at #2152 I think we can solve this problem just by disabling the `schema type` during edit. This way we can stop users to change schema type during schema edit :).\r\nAlso I checked if there is any PUT endpoint on schema registry service but seems like there is nothing.\r\nhttps://docs.confluent.io/platform/current/schema-registry/develop/api.html#subjects\r\n\r\nSubmitted the PR for the above change #2495\r\n", "wontfix: https://github.com/provectus/kafka-ui/pull/2495#issuecomment-1231478236" ]
[ "```suggestion\r\n description: upon updating a schema, the type of an existing schema can't be changed\r\n```\r\nmade it a bit clearer", "```suggestion\r\n description: upon updating a schema, the type of an existing schema can't be changed\r\n```", "```suggestion\r\n description: should be set for creating/updating schema subject\r\n```" ]
"2022-08-28T14:01:33Z"
[ "type/enhancement", "status/wontfix", "scope/backend" ]
Make a separate endpoint for updating schemas
Currently there's just one endpoint for both creating and updating a schema. Schema type param shouldn't be available when updating an existing schema.
[ "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml", "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[ "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml", "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[]
diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index c603b9dcc90..f64d8ce4da7 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -898,7 +898,7 @@ paths: post: tags: - Schemas - summary: create a new subject schema + summary: create a new subject schema or update existing subject schema operationId: createNewSchema parameters: - name: clusterName @@ -2594,6 +2594,7 @@ components: NewSchemaSubject: type: object + description: should be set for creating/updating schema subject properties: subject: type: string @@ -2601,6 +2602,7 @@ components: type: string schemaType: $ref: '#/components/schemas/SchemaType' + description: upon updating a schema, the type of an existing schema can't be changed required: - subject - schema @@ -2624,6 +2626,7 @@ components: SchemaType: type: string + description: upon updating a schema, the type of an existing schema can't be changed enum: - AVRO - JSON diff --git a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx index 01ab9802faf..a09f45a568c 100644 --- a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx @@ -124,7 +124,7 @@ const Edit: React.FC = () => { value={schema.schemaType} onChange={onChange} minWidth="100%" - disabled={isSubmitting} + disabled options={Object.keys(SchemaType).map((type) => ({ value: type, label: type,
null
val
train
2022-09-15T03:00:46
"2022-06-21T15:45:00Z"
Haarolean
train
provectus/kafka-ui/2152_2495
provectus/kafka-ui
provectus/kafka-ui/2152
provectus/kafka-ui/2495
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.857091588526797)" ]
fc946a1dd1d34758eb35c7382595faaac1ab5cc1
6df2d0b602f416b5b90d306a8cc7125e37cb08ef
[ "Hello there kshpilchyna! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "@VladSenyuta fyi. We had an e2e test for this or something." ]
[ "```suggestion\r\n description: upon updating a schema, the type of an existing schema can't be changed\r\n```\r\nmade it a bit clearer", "```suggestion\r\n description: upon updating a schema, the type of an existing schema can't be changed\r\n```", "```suggestion\r\n description: should be set for creating/updating schema subject\r\n```" ]
"2022-08-28T14:01:33Z"
[ "type/bug", "good first issue", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
Schema registry: Should not be possible to change a schema type during editing
Steps: 1. Schema 'AVRO' created 2. Open the created schema 3. Select 'Edit Schema' 4. Try to change the type Actual result: It's possible to change the type of the schema <img width="1541" alt="Screenshot 2022-06-10 at 18 20 42" src="https://user-images.githubusercontent.com/79516954/173110078-89cf8ec9-8897-40be-bad8-2574301f7bca.png"> Expected result: It should not be possible to change the type, so it's better to remove 'Type' from editing
[ "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml", "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[ "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml", "kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx" ]
[]
diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index c603b9dcc90..f64d8ce4da7 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -898,7 +898,7 @@ paths: post: tags: - Schemas - summary: create a new subject schema + summary: create a new subject schema or update existing subject schema operationId: createNewSchema parameters: - name: clusterName @@ -2594,6 +2594,7 @@ components: NewSchemaSubject: type: object + description: should be set for creating/updating schema subject properties: subject: type: string @@ -2601,6 +2602,7 @@ components: type: string schemaType: $ref: '#/components/schemas/SchemaType' + description: upon updating a schema, the type of an existing schema can't be changed required: - subject - schema @@ -2624,6 +2626,7 @@ components: SchemaType: type: string + description: upon updating a schema, the type of an existing schema can't be changed enum: - AVRO - JSON diff --git a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx index 01ab9802faf..a09f45a568c 100644 --- a/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Edit/Edit.tsx @@ -124,7 +124,7 @@ const Edit: React.FC = () => { value={schema.schemaType} onChange={onChange} minWidth="100%" - disabled={isSubmitting} + disabled options={Object.keys(SchemaType).map((type) => ({ value: type, label: type,
null
train
train
2022-09-15T03:00:46
"2022-06-10T16:26:34Z"
kshpilchyna
train
provectus/kafka-ui/2432_2496
provectus/kafka-ui
provectus/kafka-ui/2432
provectus/kafka-ui/2496
[ "timestamp(timedelta=1.0, similarity=0.9817753675188544)", "connected" ]
63a451452250eb20e2230afaac6516604f639d69
56e4cbf60f8a004ca44872d336cf547789ac9100
[]
[ "let's do this via env props", "done" ]
"2022-08-29T06:15:00Z"
[ "type/documentation", "scope/QA", "status/accepted" ]
[e2e] update read.me file for 'kafka-ui-e2e-checks' module with actual steps
need to update with steps: cd kafka-ui docker-compose -f documentation/compose/kafka-ui-connectors.yaml up -d // to create containers add token // for Qase integration mvn -pl '!kafka-ui-api' test -Pprod // to run tests reports: allure serve // to get reports
[ "kafka-ui-e2e-checks/README.md" ]
[ "kafka-ui-e2e-checks/README.md" ]
[]
diff --git a/kafka-ui-e2e-checks/README.md b/kafka-ui-e2e-checks/README.md index 25dc51d658b..4cd986dc475 100644 --- a/kafka-ui-e2e-checks/README.md +++ b/kafka-ui-e2e-checks/README.md @@ -45,58 +45,20 @@ docker pull selenoid/vnc:chrome_86.0 ### How to run checks -1. Run `kafka-ui` +1. Run `kafka-ui`: ``` -cd docker -docker-compose -f kafka-ui.yaml up -d +cd kafka-ui +docker-compose -f documentation/compose/kafka-ui-connectors.yaml up -d ``` -2. Run `selenoid-ui` +2. Run tests using your QaseIO API token as environment variable (put instead $s into command below) ``` -cd kafka-ui-e2e-checks/docker -docker-compose -f selenoid.yaml up -d +mvn -DQASEIO_API_TOKEN=β€˜%s’ -pl β€˜!kafka-ui-api’ test -Pprod ``` -3. Compile `kafka-ui-contract` project -``` -cd <projectRoot>/kafka-ui-contract -mvn clean compile -``` -4. Run checks -``` -cd kafka-ui-e2e-checks -mvn test -``` - -* There are several ways to run checks - -1. If you don't have selenoid run on your machine -``` - mvn test -DSHOULD_START_SELENOID=true -``` -⚠️ If you want to run checks in IDE with this approach, you'd need to set up -environment variable(`SHOULD_START_SELENOID=true`) in `Run/Edit Configurations..` - -2. For development purposes it is better to just start separate selenoid in docker-compose -Do it in separate window -``` -cd docker -docker-compose -f selenoid.yaml up -``` -Then you can just `mvn test`. By default, `SELENOID_URL` will resolve to `http://localhost:4444/wd/hub` - -It's preferred way to run. - -* If you have remote selenoid instance, set - -`SELENOID_URL` environment variable - -Example: -`mvn test -DSELENOID_URL=http://localhost:4444/wd/hub` -That's the way to run tests in CI with selenoid set up somewhere in cloud ### Reporting Reports are in `allure-results` folder. -If you have installed allure commandline(e.g. like [here](https://docs.qameta.io/allure/#_installing_a_commandline) or [here](https://www.npmjs.com/package/allure-commandline)) +If you have installed allure commandline [here](https://www.npmjs.com/package/allure-commandline)) You can see allure report with command: ``` allure serve
null
train
train
2022-08-29T12:04:27
"2022-08-12T06:43:14Z"
VladSenyuta
train
provectus/kafka-ui/2502_2503
provectus/kafka-ui
provectus/kafka-ui/2502
provectus/kafka-ui/2503
[ "keyword_pr_to_issue" ]
9e1f8d773f6f492eba422074bf5097bb91c58b9d
b135594e3f141ee2d339ee999674997f53437a0a
[ "@shubhwip I want it similar to kafka-ui.yaml or kafka-ui-connectors-only.yaml but with a few differences:\r\n1. Let's keep it lightweight, with just one copy of each container (like without 2 kafkas)\r\n2. Let's get rid of zookeeper if it's still there\r\nThanks!" ]
[ "can we place this under a subdirectory, like `/compose/scripts/`?", "also we can add this to `DOCKER_COMPOSE.md`" ]
"2022-08-30T10:35:43Z"
[ "type/documentation", "status/accepted" ]
Create a compose file with arm images
https://github.com/provectus/kafka-ui/pull/2453#discussion_r958158239
[ "documentation/compose/DOCKER_COMPOSE.md" ]
[ "documentation/compose/DOCKER_COMPOSE.md", "documentation/compose/kafka-ui-arm64.yaml", "documentation/compose/scripts/update_run.sh" ]
[]
diff --git a/documentation/compose/DOCKER_COMPOSE.md b/documentation/compose/DOCKER_COMPOSE.md index 2ea3f09c990..354a22c7853 100644 --- a/documentation/compose/DOCKER_COMPOSE.md +++ b/documentation/compose/DOCKER_COMPOSE.md @@ -1,12 +1,13 @@ # Descriptions of docker-compose configurations (*.yaml) 1. [kafka-ui.yaml](./kafka-ui.yaml) - Default configuration with 2 kafka clusters with two nodes of Schema Registry, one kafka-connect and a few dummy topics. -2. [kafka-clusters-only.yaml](./kafka-clusters-only.yaml) - A configuration for development purposes, everything besides `kafka-ui` itself (to be run locally). -3. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL -4. [kafka-cluster-sr-auth.yaml](./kafka-cluster-sr-auth.yaml) - Schema registry with authentication. -5. [kafka-ui-auth-context.yaml](./kafka-ui-auth-context.yaml) - Basic (username/password) authentication with custom path (URL) (issue 861). -6. [kafka-ui-connectors.yaml](./kafka-ui-connectors.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and Ksql functionality. -7. [kafka-ui-jmx-secured.yml](./kafka-ui-jmx-secured.yml) - Kafka’s JMX with SSL and authentication. -8. [kafka-ui-reverse-proxy.yaml](./kafka-ui-reverse-proxy.yaml) - An example for using the app behind a proxy (like nginx). -9. [kafka-ui-sasl.yaml](./kafka-ui-sasl.yaml) - SASL auth for Kafka. -10. [kafka-ui-traefik-proxy.yaml](./kafka-ui-traefik-proxy.yaml) - Traefik specific proxy configuration. +2. [kafka-ui-arm64.yaml](./kafka-ui-arm64.yaml) - Default configuration for ARM64(Mac M1) architecture with 1 kafka cluster without zookeeper with one node of Schema Registry, one kafka-connect and a few dummy topics. +3. [kafka-clusters-only.yaml](./kafka-clusters-only.yaml) - A configuration for development purposes, everything besides `kafka-ui` itself (to be run locally). +4. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL +5. [kafka-cluster-sr-auth.yaml](./kafka-cluster-sr-auth.yaml) - Schema registry with authentication. +6. [kafka-ui-auth-context.yaml](./kafka-ui-auth-context.yaml) - Basic (username/password) authentication with custom path (URL) (issue 861). +7. [kafka-ui-connectors.yaml](./kafka-ui-connectors.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and Ksql functionality. +8. [kafka-ui-jmx-secured.yml](./kafka-ui-jmx-secured.yml) - Kafka’s JMX with SSL and authentication. +9. [kafka-ui-reverse-proxy.yaml](./kafka-ui-reverse-proxy.yaml) - An example for using the app behind a proxy (like nginx). +10. [kafka-ui-sasl.yaml](./kafka-ui-sasl.yaml) - SASL auth for Kafka. +11. [kafka-ui-traefik-proxy.yaml](./kafka-ui-traefik-proxy.yaml) - Traefik specific proxy configuration. diff --git a/documentation/compose/kafka-ui-arm64.yaml b/documentation/compose/kafka-ui-arm64.yaml new file mode 100644 index 00000000000..70134c6b52a --- /dev/null +++ b/documentation/compose/kafka-ui-arm64.yaml @@ -0,0 +1,105 @@ +# This compose file uses kafka cluster without zookeeper +# Kafka without zookeeper is supported after image tag 6.2.0 +# ARM64 supported images for kafka can be found here +# https://hub.docker.com/r/confluentinc/cp-kafka/tags?page=1&name=arm64 +--- +version: '2' +services: + kafka-ui: + container_name: kafka-ui + image: provectuslabs/kafka-ui:latest + ports: + - 8080:8080 + depends_on: + - kafka0 + - schemaregistry0 + - kafka-connect0 + environment: + KAFKA_CLUSTERS_0_NAME: local + KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka0:29092 + KAFKA_CLUSTERS_0_JMXPORT: 9997 + KAFKA_CLUSTERS_0_SCHEMAREGISTRY: http://schemaregistry0:8085 + KAFKA_CLUSTERS_0_KAFKACONNECT_0_NAME: first + KAFKA_CLUSTERS_0_KAFKACONNECT_0_ADDRESS: http://kafka-connect0:8083 + + kafka0: + image: confluentinc/cp-kafka:7.0.5.arm64 + hostname: kafka0 + container_name: kafka0 + ports: + - 9092:9092 + - 9997:9997 + environment: + KAFKA_BROKER_ID: 1 + KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,CONTROLLER:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT + KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka0:29092,PLAINTEXT_HOST://localhost:9092 + KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT + KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1 + KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0 + KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1 + KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1 + KAFKA_PROCESS_ROLES: 'broker,controller' + KAFKA_NODE_ID: 1 + KAFKA_CONTROLLER_QUORUM_VOTERS: '1@kafka0:29093' + KAFKA_LISTENERS: 'PLAINTEXT://kafka0:29092,CONTROLLER://kafka0:29093,PLAINTEXT_HOST://0.0.0.0:9092' + KAFKA_CONTROLLER_LISTENER_NAMES: 'CONTROLLER' + KAFKA_LOG_DIRS: '/tmp/kraft-combined-logs' + JMX_PORT: 9997 + KAFKA_JMX_OPTS: -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Djava.rmi.server.hostname=kafka0 -Dcom.sun.management.jmxremote.rmi.port=9997 + volumes: + - ./scripts/update_run.sh:/tmp/update_run.sh + command: "bash -c 'if [ ! -f /tmp/update_run.sh ]; then echo \"ERROR: Did you forget the update_run.sh file that came with this docker-compose.yml file?\" && exit 1 ; else /tmp/update_run.sh && /etc/confluent/docker/run ; fi'" + + schemaregistry0: + image: confluentinc/cp-schema-registry:7.0.5.arm64 + ports: + - 8085:8085 + depends_on: + - kafka0 + environment: + SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka0:29092 + SCHEMA_REGISTRY_KAFKASTORE_SECURITY_PROTOCOL: PLAINTEXT + SCHEMA_REGISTRY_HOST_NAME: schemaregistry0 + SCHEMA_REGISTRY_LISTENERS: http://schemaregistry0:8085 + + SCHEMA_REGISTRY_SCHEMA_REGISTRY_INTER_INSTANCE_PROTOCOL: "http" + SCHEMA_REGISTRY_LOG4J_ROOT_LOGLEVEL: INFO + SCHEMA_REGISTRY_KAFKASTORE_TOPIC: _schemas + + kafka-connect0: + image: confluentinc/cp-kafka-connect:7.0.5.arm64 + ports: + - 8083:8083 + depends_on: + - kafka0 + - schemaregistry0 + environment: + CONNECT_BOOTSTRAP_SERVERS: kafka0:29092 + CONNECT_GROUP_ID: compose-connect-group + CONNECT_CONFIG_STORAGE_TOPIC: _connect_configs + CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1 + CONNECT_OFFSET_STORAGE_TOPIC: _connect_offset + CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1 + CONNECT_STATUS_STORAGE_TOPIC: _connect_status + CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1 + CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter + CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schemaregistry0:8085 + CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.storage.StringConverter + CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schemaregistry0:8085 + CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter + CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter + CONNECT_REST_ADVERTISED_HOST_NAME: kafka-connect0 + CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/confluent-hub-components" + + kafka-init-topics: + image: confluentinc/cp-kafka:7.0.5.arm64 + volumes: + - ./message.json:/data/message.json + depends_on: + - kafka0 + command: "bash -c 'echo Waiting for Kafka to be ready... && \ cub kafka-ready -b kafka0:29092 1 30 && \ kafka-topics --create --topic second.users --partitions 3 --replication-factor 1 --if-not-exists --bootstrap-server kafka0:29092 && \ kafka-topics --create --topic second.messages --partitions 2 --replication-factor 1 --if-not-exists --bootstrap-server kafka0:29092 && \ kafka-topics --create --topic first.messages --partitions 2 --replication-factor 1 --if-not-exists --bootstrap-server kafka0:29092 && \ kafka-console-producer --broker-list kafka0:29092 -topic second.users < /data/message.json'" diff --git a/documentation/compose/scripts/update_run.sh new file mode 100755 index 00000000000..023c832b4e1 --- /dev/null +++ b/documentation/compose/scripts/update_run.sh @@ -0,0 +1,11 @@ +# This script is required to run kafka cluster (without zookeeper) +#!/bin/sh + +# Docker workaround: Remove check for KAFKA_ZOOKEEPER_CONNECT parameter +sed -i '/KAFKA_ZOOKEEPER_CONNECT/d' /etc/confluent/docker/configure + +# Docker workaround: Ignore cub zk-ready +sed -i 's/cub zk-ready/echo ignore zk-ready/' /etc/confluent/docker/ensure + +# KRaft required step: Format the storage directory with a new cluster ID +echo "kafka-storage format --ignore-formatted -t $(kafka-storage random-uuid) -c /etc/kafka/kafka.properties" >> /etc/confluent/docker/ensure \ No newline at end of file
null
train
train
2022-08-30T10:06:05
"2022-08-30T08:16:20Z"
Haarolean
train
provectus/kafka-ui/2180_2508
provectus/kafka-ui
provectus/kafka-ui/2180
provectus/kafka-ui/2508
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.9145114253556753)" ]
77dca1594c7bf07176b6a82f6b5153b456d78dc0
91b86b5b780eb6c090c89eebd405f3487147e6da
[]
[]
"2022-09-01T01:56:32Z"
[ "type/bug", "good first issue", "scope/frontend", "status/accepted", "status/confirmed" ]
The 'e' letter allowed to paste into number fields within Create a topic
**Describe the bug** The letter 'e' is allowed to be pasted into number fields within Create a Topic **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topics 2. Add a Topic 3. Copy and paste a word including the letter 'e' into the number fields **Expected behavior** Letters should not be allowed in number fields
[ "kafka-ui-react-app/src/components/common/Input/Input.tsx" ]
[ "kafka-ui-react-app/src/components/common/Input/Input.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/common/Input/Input.tsx b/kafka-ui-react-app/src/components/common/Input/Input.tsx index 58a3bb92c85..d6c6416bfe7 100644 --- a/kafka-ui-react-app/src/components/common/Input/Input.tsx +++ b/kafka-ui-react-app/src/components/common/Input/Input.tsx @@ -17,6 +17,7 @@ const Input: React.FC<InputProps> = ({ hookFormOptions, search, inputSize = 'L', + type, ...rest }) => { const methods = useFormContext(); @@ -28,10 +29,30 @@ const Input: React.FC<InputProps> = ({ inputSize={inputSize} {...methods.register(name, { ...hookFormOptions })} hasLeftIcon={!!search} + type={type} {...rest} + onKeyDown={(e) => { + if (type === 'number') { + if (e.key === 'e') { + e.preventDefault(); + } + } + }} + onPaste={(e) => { + if (type === 'number') { + e.preventDefault(); + const value = e.clipboardData.getData('Text'); + methods.setValue(name, value.replace(/[^\d.]/g, '')); + } + }} /> ) : ( - <S.Input inputSize={inputSize} hasLeftIcon={!!search} {...rest} /> + <S.Input + inputSize={inputSize} + hasLeftIcon={!!search} + type={type} + {...rest} + /> )} </S.Wrapper> );
null
train
train
2022-09-01T15:31:58
"2022-06-17T05:57:21Z"
armenuikafka
train
provectus/kafka-ui/2164_2520
provectus/kafka-ui
provectus/kafka-ui/2164
provectus/kafka-ui/2520
[ "timestamp(timedelta=1.0, similarity=1.0000000000000002)", "connected" ]
9f32abcd09255860ff9c158434be64cade862228
fc946a1dd1d34758eb35c7382595faaac1ab5cc1
[ "@armenuikafka I tried to reproduce this issue, although, I'm getting this error:\r\n![Screen Shot 2022-08-31 at 12 15 57](https://user-images.githubusercontent.com/110129035/187750536-a96905db-6fef-4ea8-b71c-160f01884fce.png)\r\n\r\nCould you confirm if you still get the ugly error using the latest codebase? I wonder if I'm doing something wrong or if it's already fixed.", "@daching-provectus hey, the error still actual on master. \r\n\r\nSteps to reproduse:\r\n- Use the avro schema type\r\n- edit the schema\r\n- delete any property (\"name\")\r\n- submit\r\n\r\n\r\n\r\nhttps://user-images.githubusercontent.com/104780608/187843020-c9469cd4-0044-4fbd-96b3-362dd13a58e0.mov\r\n\r\n\r\n" ]
[]
"2022-09-02T17:00:41Z"
[ "type/enhancement", "good first issue", "scope/backend", "status/accepted" ]
Handle error message for schemas
**Describe the bug** A non-user-friendly error message appears when submitting wrong data in a Schema **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Schema Registry 2. Open the schema 3. Edit the schema 4. Leave any field empty or input wrong data 5. Press submit **Expected behavior** The error message about wrong data should be user friendly and indicate which data is wrong **Screenshots** <img width="1718" alt="error message schema" src="https://user-images.githubusercontent.com/104780608/173566018-8e125a64-c4dc-48e3-ab3c-950f3ef45b51.png">
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java index 92603ba979e..2a3039b3fe4 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java @@ -66,6 +66,7 @@ public class SchemaRegistryService { private static final String UNRECOGNIZED_FIELD_SCHEMA_TYPE = "Unrecognized field: schemaType"; private static final String INCOMPATIBLE_WITH_AN_EARLIER_SCHEMA = "incompatible with an earlier schema"; + private static final String INVALID_SCHEMA = "Invalid Schema"; private final WebClient webClient; @@ -237,7 +238,8 @@ private Mono<Throwable> getMonoError(ErrorResponse x) { } else if (isIncompatibleSchemaMessage(x.getMessage())) { return Mono.error(new SchemaCompatibilityException(x.getMessage())); } else { - return Mono.error(new UnprocessableEntityException(x.getMessage())); + log.error(x.getMessage()); + return Mono.error(new UnprocessableEntityException(INVALID_SCHEMA)); } }
null
train
train
2022-09-15T02:38:05
"2022-06-14T11:32:31Z"
armenuikafka
train
provectus/kafka-ui/1995_2567
provectus/kafka-ui
provectus/kafka-ui/1995
provectus/kafka-ui/2567
[ "keyword_pr_to_issue", "timestamp(timedelta=148686.0, similarity=0.9338459454047882)" ]
df2b2e01deb1f7896e634c53ef8082c87978e472
ed8b84b4148680cc13b5dc3d1bbbfe3ae4acca66
[ "Hello there lbalcerek! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "Hey, thanks for reaching out. We'll take a look.", "Steps to reproduce:\r\n1. `provectuslabs/kafka-ui:0.3.0`, create a schema with name \"test/test\"\r\n2. Get a list of schemas. It works fine.\r\n3. Do the same with `provectuslabs/kafka-ui:0.4.0`, request to fetch schemas list fails.\r\n", "Seems the problem is actual on master, newly created Schema with \"/\" sign in Subject, is not available. See attached screen record please.\r\n\r\nhttps://user-images.githubusercontent.com/104780608/187169410-9f59ad78-6ae2-4cd5-80eb-e1194b01571b.mov\r\n\r\n", "@shubhwip wanna take a look?", "> @shubhwip wanna take a look?\r\n\r\nThis is something i mentioned earlier two times on the frontend PR below :) and also shared a video for exact same problem.\r\nhttps://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434\r\nhttps://user-images.githubusercontent.com/23444368/186343542-9be868f5-621b-4a2a-9fa5-e49c686465a0.mov\r\nUnfortunately i don't think i have a solution for this :), I tried before already.", "> Seems the problem is actual on master, newly created Schema with \"/\" sign in Subject, is not available. See attached screen record please.\r\n> \r\n> testSchema.mov\r\n\r\n**Just a note** : If you try to manually replace `/` with `%2F` in the browser tab then it will work however it is not a solution. Just mentioning it here.", "> > @shubhwip wanna take a look?\r\n> \r\n> This is something i mentioned earlier two times on the frontend PR below :) and also shared a video for exact same problem. [#2483 (comment)](https://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434) https://user-images.githubusercontent.com/23444368/186343542-9be868f5-621b-4a2a-9fa5-e49c686465a0.mov Unfortunately i don't think i have a solution for this :), I tried before already.\r\n\r\nOh well, sorry, I got a feeling you've fixed this eventually. \r\n", "@Kris-K-Dev PTAL https://github.com/provectus/kafka-ui/pull/2483#issuecomment-1225243434", "Actual on master: newly created schema with \"/\" sign is not available\r\n\r\n<img width=\"1621\" alt=\"schema new\" src=\"https://user-images.githubusercontent.com/104780608/190325765-cc0b3947-68bb-4edf-9fbe-f6b2d7600f88.png\">\r\n", "@armenuikafka Can you please share video for steps you followed for reproducing because I just tried from master and it works.\r\n<img width=\"1440\" alt=\"Screenshot 2022-09-15 at 11 51 39 AM\" src=\"https://user-images.githubusercontent.com/23444368/190331576-a2aaf33d-dba8-4768-a4a1-d329c646dddc.png\">\r\n", "The env this has been checked on is pretty much outdated considering the commit tag on the screenshot :)", "@shubhwip there was a problem which is fixed now and everything works as expected !", "@shubhwip thank you :)" ]
[]
"2022-09-13T00:10:22Z"
[ "type/bug", "good first issue", "scope/backend", "scope/frontend", "status/accepted", "status/confirmed" ]
No URL encoding while getting schemas
**Describe the bug** We have a subject with a '/' sign in it, which causes an exception after choosing the 'Schema Registry' option. I can see that the app requests: `GET https://schema-registry.nonprod.ipfdigital.io/subjects/apollo/commons/Header.proto/versions/latest` so without proper URL encoding, while it should be: `https://schema-registry.nonprod.ipfdigital.io/subjects/apollo%2fcommons%2fHeader.proto/versions/latest` In version 0.3.x everything works fine; something broke in 0.4. **Screenshots** ![image](https://user-images.githubusercontent.com/16690739/169000231-d76ad178-8d0a-4d33-840b-c5ccf76fd4de.png) **Exception** `{"code":4009,"message":"No such schema apollo/commons/Header.proto with version latest","timestamp":1652860516935,"requestId":"5d9f795f-81176","fieldsErrors":null,"stackTrace":"com.provectus.kafka.ui.exception.SchemaNotFoundException: No such schema apollo/commons/Header.proto with version latest\n\tat com.provectus.kafka.ui.service.SchemaRegistryService.lambda$throwIfNotFoundStatus$20(SchemaRegistryService.java:238)\n\tSuppressed: The stacktrace has been enhanced by Reactor, refer to additional information below: \nError has been observed at the following site(s):\n\t*__checkpoint ⇒ 404 from GET http://schema-registry-service:8081/subjects/apollo/commons/Header.proto/versions/latest [DefaultWebClient]\n\t*__checkpoint ⇒ Handler com.provectus.kafka.ui.controller.SchemasController#getSchemas(String, Integer, Integer, String, ServerWebExchange) [DispatcherHandler]\n\t*__checkpoint ⇒ com.provectus.kafka.ui.config.ReadOnlyModeFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ com.provectus.kafka.ui.config.CustomWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authorization.AuthorizationWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authorization.ExceptionTranslationWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.authentication.logout.LogoutWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.savedrequest.ServerRequestCacheWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.context.SecurityContextServerWebExchangeWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.context.ReactorContextWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.header.HttpHeaderWriterWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.config.web.server.ServerHttpSecurity$ServerWebExchangeReactorContextWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.security.web.server.WebFilterChainProxy [DefaultWebFilterChain]\n\t*__checkpoint ⇒ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]\n\t*__checkpoint ⇒ HTTP GET \"/api/clusters/nonprod-msk-cluster/schemas\" [ExceptionHandlingWebHandler]\nOriginal Stack Trace:\n\t\tat com.provectus.kafka.ui.service.SchemaRegistryService.lambda$throwIfNotFoundStatus$20(SchemaRegistryService.java:238)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec$StatusHandler.apply(DefaultWebClient.java:695)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.applyStatusHandlers(DefaultWebClient.java:654)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.handleBodyMono(DefaultWebClient.java:623)\n\t\tat org.springframework.web.reactive.function.client.DefaultWebClient$DefaultResponseSpec.lambda$bodyToMono$2(DefaultWebClient.java:541)\n\t\tat reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:125)\n\t\tat reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)\n\t\tat reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:120)\n\t\tat reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.FluxPeek$PeekSubscriber.onNext(FluxPeek.java:200)\n\t\tat reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:82)\n\t\tat reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398)\n\t\tat reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain.onSubscribeInner(MonoFlatMapMany.java:150)\n\t\tat reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain.onNext(MonoFlatMapMany.java:189)\n\t\tat reactor.core.publisher.SerializedSubscriber.onNext(SerializedSubscriber.java:99)\n\t\tat reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onNext(FluxRetryWhen.java:174)\n\t\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:165)\n\t\tat reactor.netty.http.client.HttpClientConnect$HttpIOHandlerObserver.onStateChange(HttpClientConnect.java:414)\n\t\tat reactor.netty.ReactorNetty$CompositeConnectionObserver.onStateChange(ReactorNetty.java:671)\n\t\tat reactor.netty.resources.DefaultPooledConnectionProvider$DisposableAcquire.onStateChange(DefaultPooledConnectionProvider.java:183)\n\t\tat reactor.netty.resources.DefaultPooledConnectionProvider$PooledConnection.onStateChange(DefaultPooledConnectionProvider.java:439)\n\t\tat reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:637)\n\t\tat reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435)\n\t\tat io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)\n\t\tat io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)\n\t\tat io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)\n\t\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)\n\t\tat io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)\n\t\tat io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)\n\t\tat io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)\n\t\tat io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)\n\t\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)\n\t\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)\n\t\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)\n\t\tat java.base/java.lang.Thread.run(Thread.java:830)\n"}`
[ "kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx" ]
[ "kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx b/kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx index aa6b2e826ce..b6ca656d1d1 100644 --- a/kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx +++ b/kafka-ui-react-app/src/components/common/NewTable/LinkCell.tsx @@ -7,7 +7,7 @@ const LinkCell: React.FC<CellContext<any, unknown>> = ({ getValue }) => { const value = `${getValue<string | number>()}`; const handleClick: React.MouseEventHandler = (e) => e.stopPropagation(); return ( - <NavLink to={value} title={value} onClick={handleClick}> + <NavLink to={encodeURIComponent(value)} title={value} onClick={handleClick}> {value} </NavLink> );
null
train
train
2022-09-14T11:16:28
"2022-05-18T09:00:26Z"
lbalcerek
train
provectus/kafka-ui/1933_2583
provectus/kafka-ui
provectus/kafka-ui/1933
provectus/kafka-ui/2583
[ "connected" ]
596f4233fcd4f7ace80141f5e3b5e1f3ed640be5
7db55d5acf29d1daab488ea959e40ff143054699
[ "`{\"git\":{\"commit\":{\"id\":\"c8306f5\"}},\"build\":{\"artifact\":\"kafka-ui-api\",\"name\":\"kafka-ui-api\",\"time\":\"2022-09-15T02:31:12.047Z\",\"version\":\"0.0.1-SNAPSHOT\",\"group\":\"com.provectus\"}}`\r\n#2583", "7th point is unclear", "> 7th point is unclear\r\n\r\na short commit contains a link, let's keep it" ]
[]
"2022-09-15T02:33:05Z"
[ "type/enhancement", "scope/backend", "scope/frontend", "status/accepted" ]
Display build date instead of full commit hash in version info
1. Remove VITE_TAG 2. Remove VITE_COMMIT 3. Fetch info from Spring API `/actuator/info` `{commit, tag, buildDate: timestamp}` 4. If tag is a version (v0.5.0) fetch latest version from github and compare with tag. 5. If tag is a commit we need to display formatted timestamp (`/api/info/timestampformat/iso`) 6. If tag contains `-SNAPSHOT` - display formatted timestamp 7. Keep the short one with the link
[ "kafka-ui-api/pom.xml" ]
[ "kafka-ui-api/pom.xml" ]
[]
diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index a43c087b428..0cee2260f90 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -231,6 +231,7 @@ <execution> <goals> <goal>repackage</goal> + <goal>build-info</goal> </goals> </execution> </executions>
null
train
train
2022-09-19T14:32:47
"2022-05-08T00:26:58Z"
Haarolean
train
provectus/kafka-ui/2584_2591
provectus/kafka-ui
provectus/kafka-ui/2584
provectus/kafka-ui/2591
[ "connected" ]
15d09751a6127af28f7d01576967d486d57e5f01
72dd7127f8496f76264e2540faba0798551ca997
[]
[ "no need to do this way. you can call method from this class using 'new SchemaEditView()'", "'private static final' - really??", "the best practice in this case to use 'protected' modifier only", "Done)", "corrected " ]
"2022-09-15T13:31:56Z"
[ "scope/QA" ]
[e2e] Need to update com.provectus.kafka.ui.tests.SchemasTests#updateSchemaAvro
According to https://github.com/provectus/kafka-ui/issues/2152#issuecomment-1247454408 flow was updated
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java" ]
[ "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java" ]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java index b4e2ca419e6..716e7ef80c1 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java @@ -14,7 +14,8 @@ public class SchemaEditView { - SelenideElement newSchemaTextArea = $("#newSchema [wrap]"); + protected SelenideElement newSchemaTextArea = $("#newSchema [wrap]"); + protected SelenideElement schemaTypeDropDown = $x("//ul[@name='schemaType']"); @Step public SchemaEditView selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType schemaType) { @@ -49,4 +50,16 @@ public SchemaRegistryList removeSchema() { $(By.xpath("//*[text()='Confirm']")).shouldBe(Condition.visible).click(); return new SchemaRegistryList(); } + + @Step + public boolean isSchemaDropDownDisabled(){ + boolean disabled = false; + try{ + String attribute = schemaTypeDropDown.getAttribute("disabled"); + disabled = true; + } + catch (Throwable ignored){ + } + return disabled; + } }
diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java index da14264ec39..22c8db453d9 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java @@ -6,6 +6,7 @@ import com.provectus.kafka.ui.helpers.Helpers; import com.provectus.kafka.ui.pages.MainPage; import com.provectus.kafka.ui.pages.schema.SchemaCreateView; +import com.provectus.kafka.ui.pages.schema.SchemaEditView; import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; @@ -82,8 +83,9 @@ void updateSchemaAvro() { .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.openSchema(SCHEMA_AVRO_API_UPDATE) .waitUntilScreenReady() - .openEditSchema() - .selectCompatibilityLevelFromDropdown(CompatibilityLevel.CompatibilityEnum.NONE) + .openEditSchema(); + Assertions.assertTrue(new SchemaEditView().isSchemaDropDownDisabled(),"isSchemaDropDownDisabled()"); + new SchemaEditView().selectCompatibilityLevelFromDropdown(CompatibilityLevel.CompatibilityEnum.NONE) .setNewSchemaValue(fileToString(PATH_AVRO_FOR_UPDATE)) .clickSubmit() .waitUntilScreenReady()
test
train
2022-09-16T03:37:28
"2022-09-15T06:11:50Z"
VladSenyuta
train
provectus/kafka-ui/1932_2594
provectus/kafka-ui
provectus/kafka-ui/1932
provectus/kafka-ui/2594
[ "timestamp(timedelta=1.0, similarity=0.9999999999999998)", "connected" ]
d149d26013a81e22e5d0c8b59db56ac0565b9e55
71aa44a3d38edf8cc3976c3f4f74f68ee2a6f3a9
[ "Builds fine with \r\n`./mvnw -Dmaven.test.skip=true spring-boot:build-image -Dspring-boot.build-image.imageName=provectuslabs/kafka-ui -Dspring-boot.build-image.builder=paketobuildpacks/builder:tiny`\r\nand jdk17.\r\nThe only problem is image size. Need to investigate.", "During the research decided to postpone using spring-boot (paketo) due to complicated process of adding plain apk dependencies (we have some in dockerfile).\r\nFabric8 plugin looks almost identical to the one we currently have, so inclining to that choice." ]
[]
"2022-09-15T21:37:53Z"
[ "scope/backend", "status/accepted", "type/chore" ]
Get rid of dockerfile-maven-plugin
It's deprecated and doesn't work on M1. Alternatives: - spring-boot:build-image (via paketo) (preferred) - jib (not so much) - fabric8 - maven-exec-plugin (lol) spring: 1) add apt packages via [apt buildpack](https://github.com/fagiani/apt-buildpack)
[ "kafka-ui-api/pom.xml", "pom.xml" ]
[ "kafka-ui-api/pom.xml", "pom.xml" ]
[]
diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index 5c00dc0f6d9..14c2067886c 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -443,11 +443,22 @@ </executions> </plugin> <plugin> - <groupId>com.spotify</groupId> - <artifactId>dockerfile-maven-plugin</artifactId> - <version>${dockerfile-maven-plugin.version}</version> + <groupId>io.fabric8</groupId> + <artifactId>docker-maven-plugin</artifactId> + <version>${fabric8-maven-plugin.version}</version> <configuration> - <skipPush>true</skipPush> + <verbose>true</verbose> + <images> + <image> + <name>provectuslabs/kafka-ui:${git.revision}</name> + <build> + <contextDir>${project.basedir}</contextDir> + <args> + <JAR_FILE>${project.build.finalName}.jar</JAR_FILE> + </args> + </build> + </image> + </images> </configuration> <executions> <execution> @@ -456,14 +467,6 @@ <goals> <goal>build</goal> </goals> - <configuration> - <tag>${git.revision}</tag> - <repository>provectuslabs/kafka-ui</repository> - <buildArgs> - <JAR_FILE>${project.build.finalName}.jar</JAR_FILE> - <JAR_NAME>${project.artifactId}.jar</JAR_NAME> - </buildArgs> - </configuration> </execution> </executions> </plugin> diff --git a/pom.xml b/pom.xml index c06d556132d..513752c85f7 100644 --- a/pom.xml +++ b/pom.xml @@ -23,7 +23,7 @@ <kafka-clients.version>3.2.0</kafka-clients.version> <node.version>v16.15.0</node.version> <pnpm.version>v7.4.0</pnpm.version> - <dockerfile-maven-plugin.version>1.4.10</dockerfile-maven-plugin.version> + <fabric8-maven-plugin.version>0.40.2</fabric8-maven-plugin.version> <frontend-maven-plugin.version>1.12.1</frontend-maven-plugin.version> <maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version> <maven-clean-plugin.version>3.1.0</maven-clean-plugin.version>
null
train
train
2022-09-15T13:25:49
"2022-05-07T22:57:49Z"
Haarolean
train
provectus/kafka-ui/2527_2596
provectus/kafka-ui
provectus/kafka-ui/2527
provectus/kafka-ui/2596
[ "timestamp(timedelta=1.0, similarity=0.9901141870449481)", "connected" ]
72dd7127f8496f76264e2540faba0798551ca997
fa8c03a664fea67ee92b0a38db1f0c20992566ad
[ "A thought, change it to \"Broker Count\" maybe? :)" ]
[]
"2022-09-16T07:42:22Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Rename "Total Broker" to "Broker Count"
**Describe the bug** Rename "Total Broker" to "Broker Count" **Set up** https://www.kafka-ui.provectus.io/ui/clusters/local/brokers **Screenshots** <img width="1711" alt="total brokers" src="https://user-images.githubusercontent.com/104780608/188552969-4cf84295-e120-4827-acdc-51106e41ee33.png">
[ "kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx" ]
[ "kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx index 0cefb30fb04..9f63be31c29 100644 --- a/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx +++ b/kafka-ui-react-app/src/components/Brokers/BrokersList/BrokersList.tsx @@ -63,7 +63,7 @@ const BrokersList: React.FC = () => { <PageHeading text="Brokers" /> <Metrics.Wrapper> <Metrics.Section title="Uptime"> - <Metrics.Indicator label="Total Broker"> + <Metrics.Indicator label="Broker Count"> {brokerCount} </Metrics.Indicator> <Metrics.Indicator label="Active Controllers">
null
train
train
2022-09-16T10:26:13
"2022-09-06T05:22:54Z"
armenuikafka
train
provectus/kafka-ui/2533_2597
provectus/kafka-ui
provectus/kafka-ui/2533
provectus/kafka-ui/2597
[ "timestamp(timedelta=1.0, similarity=1.0)", "connected" ]
e300aa7d197fc7a3698e9f31b24c33ea18f5818d
6d448c032218fbb000b71fdc7855fa5fcde62fd6
[ "Is this issue taken? If not, I want to do it.", "No, it’s not. Please go ahead:)\n\n> On 15 Sep 2022, at 18:35, Winnie Chiu ***@***.***> wrote:\n> \n> ο»Ώ\n> Is this issue taken? If not, I want to do it.\n> \n> β€”\n> Reply to this email directly, view it on GitHub, or unsubscribe.\n> You are receiving this because you are subscribed to this thread.\n" ]
[]
"2022-09-16T09:47:45Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Rename "Relevant" version of a schema to "Actual"
**Describe the bug** Update "Relevant" version of Schema to be "Actual" **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to Schema Registry 2. Open the Schema **Expected behavior** Should be 'actual version" instead of "Relevant version" **Screenshots** <img width="1717" alt="relevant version" src="https://user-images.githubusercontent.com/104780608/188567076-82590b28-f3f0-49ca-b65a-cc0862b8ceae.png"> **Additional context** <!-- (Add any other context about the problem here) -->
[ "kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx", "kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx", "kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx b/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx index 4efa47d7fe7..5cc472d9f0f 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx @@ -14,7 +14,7 @@ const LatestVersionItem: React.FC<LatestVersionProps> = ({ }) => ( <S.Wrapper> <div> - <Heading level={3}>Relevant version</Heading> + <Heading level={3}>Actual version</Heading> <EditorViewer data={schema} schemaType={schemaType} maxLines={28} /> </div> <div> diff --git a/kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx b/kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx index 9962bd2e966..e4d70548cad 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/__test__/LatestVersionItem.spec.tsx @@ -8,7 +8,7 @@ import { jsonSchema, protoSchema } from './fixtures'; describe('LatestVersionItem', () => { it('renders latest version of json schema', () => { render(<LatestVersionItem schema={jsonSchema} />); - expect(screen.getByText('Relevant version')).toBeInTheDocument(); + expect(screen.getByText('Actual version')).toBeInTheDocument(); expect(screen.getByText('Latest version')).toBeInTheDocument(); expect(screen.getByText('ID')).toBeInTheDocument(); expect(screen.getByText('Subject')).toBeInTheDocument(); @@ -18,7 +18,7 @@ describe('LatestVersionItem', () => { it('renders latest version of compatibility', () => { render(<LatestVersionItem schema={protoSchema} />); - expect(screen.getByText('Relevant version')).toBeInTheDocument(); + expect(screen.getByText('Actual version')).toBeInTheDocument(); expect(screen.getByText('Latest version')).toBeInTheDocument(); expect(screen.getByText('ID')).toBeInTheDocument(); expect(screen.getByText('Subject')).toBeInTheDocument();
null
train
train
2022-09-16T16:34:47
"2022-09-06T06:56:02Z"
armenuikafka
train
provectus/kafka-ui/2353_2599
provectus/kafka-ui
provectus/kafka-ui/2353
provectus/kafka-ui/2599
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=1.0000000000000002)" ]
b3240d9057d5470dda93d56bb3e0145a768662eb
94da2f4e7f98f969df7a3509bfff41850fef71e3
[ "Considering #2354 it seems like we have validation bound to producing messages. There should be one for creating/updating schemas as well." ]
[ "I don't think we need this `eslint-disable` here", "@Haarolean pls help us with error message", "```suggestion\r\n 'Schema syntax is not valid',\r\n```", "@habrahamyanpro is it a generic message for all the errors? If we'd stick with \"invalid data\" message, that's gonna be ambiguous. ", "Pls check this test again and fix ", "same", "```suggestion\r\n config: yup.string().required().isValidJsonObject(),\r\n```", "i think this check `e` could make invalid values true like `eWrong = {}`\r\n`\r\ntrimmedValue.indexOf('enum') === 0`", "i replace 'e' with 'enum', also added test cases " ]
"2022-09-16T14:18:30Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
SR: Display a warning in case of invalid syntax
**Describe the bug** It's possible to submit data placed outside the {} braces when editing a Schema **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Schema Registry 2. Select the Schema 3. Edit the Schema 4. Add data after the {} 5. Press "Submit" **Expected behavior** A warning should appear saying that invalid data was entered, or it should not be possible to submit if data is added outside the {} scope **Screenshots** <img width="1726" alt="invalid data within json" src="https://user-images.githubusercontent.com/104780608/181295973-3e1a4e01-cb7e-4e54-9810-22a577479758.png">
[ "kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx", "kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[ "kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx", "kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts", "kafka-ui-react-app/src/lib/yupExtended.ts" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx b/kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx index 9ce7f280f43..2fce1ad7d79 100644 --- a/kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Edit/Form.tsx @@ -10,6 +10,7 @@ import { clusterSchemasPath, ClusterSubjectParam, } from 'lib/paths'; +import yup from 'lib/yupExtended'; import { NewSchemaSubjectRaw } from 'redux/interfaces'; import Editor from 'components/common/Editor/Editor'; import Select from 'components/common/Select/Select'; @@ -28,6 +29,9 @@ import { import PageLoader from 'components/common/PageLoader/PageLoader'; import { schemasApiClient } from 'lib/api'; import { showServerError } from 'lib/errorHandling'; +import { yupResolver } from '@hookform/resolvers/yup'; +import { FormError } from 'components/common/Input/Input.styled'; +import { ErrorMessage } from '@hookform/error-message'; import * as S from './Edit.styled'; @@ -47,8 +51,16 @@ const Form: React.FC = () => { : JSON.stringify(JSON.parse(schema?.schema || '{}'), null, '\t'); }, [schema]); + const validationSchema = () => + yup.object().shape({ + newSchema: + schema?.schemaType === SchemaType.PROTOBUF + ? yup.string().required().isEnum('Schema syntax is not valid') + : yup.string().required().isJsonObject('Schema syntax is not valid'), + }); const methods = useForm<NewSchemaSubjectRaw>({ mode: 'onChange', + resolver: yupResolver(validationSchema()), defaultValues: { schemaType: schema?.schemaType, compatibilityLevel: @@ -58,11 +70,10 @@ const Form: React.FC = () => { }); const { - formState: { isDirty, isSubmitting, dirtyFields }, + formState: { isDirty, isSubmitting, dirtyFields, errors }, control, handleSubmit, } = methods; - const onSubmit = async (props: NewSchemaSubjectRaw) => { if (!schema) return; @@ -191,11 +202,14 @@ const Form: React.FC = () => { )} /> </S.EditorContainer> + <FormError> + <ErrorMessage errors={errors} name="newSchema" /> + </FormError> <Button buttonType="primary" buttonSize="M" type="submit" - disabled={!isDirty || isSubmitting} + disabled={!isDirty || isSubmitting || !!errors.newSchema} > Submit </Button> diff --git a/kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts b/kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts index bd43dd3f72a..8100b9a3264 100644 --- a/kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts +++ b/kafka-ui-react-app/src/lib/__test__/yupExtended.spec.ts @@ -1,5 +1,19 @@ -import { isValidJsonObject } from 'lib/yupExtended'; +import { isValidEnum, isValidJsonObject } from 'lib/yupExtended'; +const invalidEnum = ` +ennum SchemType { + AVRO = 0; + JSON = 1; + PROTOBUF = 3; +} +`; +const validEnum = ` +enum SchemType { + AVRO = 0; + JSON = 1; + PROTOBUF = 3; +} +`; describe('yup extended', () => { describe('isValidJsonObject', () => { it('returns false for no value', () => { @@ -21,4 +35,21 @@ describe('yup extended', () => { expect(isValidJsonObject('{ "foo": "bar" }')).toBeTruthy(); }); }); + + describe('isValidEnum', () => { + it('returns false for invalid enum', () => { + expect(isValidEnum(invalidEnum)).toBeFalsy(); + }); + it('returns false for no value', () => { + expect(isValidEnum()).toBeFalsy(); + }); + it('returns true should trim value', () => { + expect( + isValidEnum(` enum SchemType {AVRO = 0; PROTOBUF = 3;} `) + ).toBeTruthy(); + }); + it('returns true for valid enum', () => { + expect(isValidEnum(validEnum)).toBeTruthy(); + }); + }); }); diff --git a/kafka-ui-react-app/src/lib/yupExtended.ts b/kafka-ui-react-app/src/lib/yupExtended.ts index 9c96e073db5..4c662ca8222 100644 --- a/kafka-ui-react-app/src/lib/yupExtended.ts +++ b/kafka-ui-react-app/src/lib/yupExtended.ts @@ -9,7 +9,8 @@ declare module 'yup' { TDefault = undefined, TFlags extends yup.Flags = '' > extends yup.Schema<TType, TContext, TDefault, TFlags> { - isJsonObject(): StringSchema<TType, TContext>; + isJsonObject(message?: string): StringSchema<TType, TContext>; + isEnum(message?: string): StringSchema<TType, TContext>; } } @@ -31,15 +32,40 @@ export const isValidJsonObject = (value?: string) => { return false; }; -const isJsonObject = () => { +const isJsonObject = (message?: string) => { return yup.string().test( 'isJsonObject', // eslint-disable-next-line no-template-curly-in-string - '${path} is not JSON object', + message || '${path} is not JSON object', isValidJsonObject ); }; +export const isValidEnum = (value?: string) => { + try { + if (!value) return false; + const trimmedValue = value.trim(); + if ( + trimmedValue.indexOf('enum') === 0 && + trimmedValue.lastIndexOf('}') === trimmedValue.length - 1 + ) { + return true; + } + } catch { + // do nothing + } + return false; +}; + +const isEnum = (message?: string) => { + return yup.string().test( + 'isEnum', + // eslint-disable-next-line no-template-curly-in-string + message || '${path} is not Enum object', + isValidEnum + ); +}; + /** * due to yup rerunning all the object validiation during any render, * it makes sense to cache the async results @@ -62,6 +88,7 @@ export function cacheTest( } yup.addMethod(yup.StringSchema, 'isJsonObject', isJsonObject); +yup.addMethod(yup.StringSchema, 'isEnum', isEnum); export const topicFormValidationSchema = yup.object().shape({ name: yup
null
val
train
2023-04-10T13:06:15
"2022-07-27T16:09:29Z"
armenuikafka
train
provectus/kafka-ui/2550_2604
provectus/kafka-ui
provectus/kafka-ui/2550
provectus/kafka-ui/2604
[ "timestamp(timedelta=1.0, similarity=1.0000000000000002)", "connected" ]
6d448c032218fbb000b71fdc7855fa5fcde62fd6
3bfdc98cc962560df5bb001bb2a63e0222d4bc9f
[ "I like to work on this issue." ]
[]
"2022-09-18T11:53:12Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Change the "Back to custom filters" link text within Saved filters
**Describe the bug** Change the "Back to custom filters" link text to be "Back to create filter" within Saved filters of Topic/Messages tab **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Topic/Messages tab 2. Add filter 3. Press Saved filters **Expected behavior** The "Back to custom filters" link text should be "Back to create filter" **Screenshots** <img width="1437" alt="back to create" src="https://user-images.githubusercontent.com/104780608/189111269-3884c52f-9c71-458a-a021-44a4adcd499f.png">
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx", "kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx index 26036ae9aea..6a757d34383 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/SavedFilters.tsx @@ -58,7 +58,7 @@ const SavedFilters: FC<Props> = ({ return ( <> <S.BackToCustomText onClick={onGoBack}> - Back To custom filters + Back To create filters </S.BackToCustomText> <S.SavedFiltersContainer> <S.CreatedFilter>Saved filters</S.CreatedFilter> diff --git a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx index e8b9178e54a..cac3bee8ec6 100644 --- a/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx +++ b/kafka-ui-react-app/src/components/Topics/Topic/Messages/Filters/__tests__/SavedFilters.spec.tsx @@ -45,7 +45,7 @@ describe('SavedFilter Component', () => { it('should check on go back button click', () => { const onGoBackMock = jest.fn(); setUpComponent({ onGoBack: onGoBackMock }); - userEvent.click(screen.getByText(/back to custom filters/i)); + userEvent.click(screen.getByText(/back to create filters/i)); expect(onGoBackMock).toHaveBeenCalled(); });
null
train
train
2022-09-17T19:42:36
"2022-09-08T11:29:47Z"
armenuikafka
train
provectus/kafka-ui/2344_2605
provectus/kafka-ui
provectus/kafka-ui/2344
provectus/kafka-ui/2605
[ "timestamp(timedelta=1.0, similarity=0.9174110017148175)", "connected" ]
3b8cbd1dbf92300e795658c3496378f4abd25263
596f4233fcd4f7ace80141f5e3b5e1f3ed640be5
[]
[]
"2022-09-19T00:54:59Z"
[ "type/bug", "good first issue", "scope/backend", "status/accepted", "status/confirmed" ]
Configure ldap beans manually
Due to this exception: ``` org.springframework.ldap.CommunicationException: localhost:389; nested exception is javax.naming.CommunicationException: localhost:389 [Root exception is java.net.ConnectException: Connection refused] at org.springframework.ldap.support.LdapUtils.convertLdapException(LdapUtils.java:108) at org.springframework.ldap.core.support.AbstractContextSource.createContext(AbstractContextSource.java:362) at org.springframework.ldap.core.support.AbstractContextSource.getReadOnlyContext(AbstractContextSource.java:171) at org.springframework.ldap.core.LdapTemplate.executeReadOnly(LdapTemplate.java:806) at org.springframework.boot.actuate.ldap.LdapHealthIndicator.doHealthCheck(LdapHealthIndicator.java:50) at org.springframework.boot.actuate.health.AbstractHealthIndicator.health(AbstractHealthIndicator.java:82) at reactor.core.publisher.MonoCallable.call(MonoCallable.java:92) at reactor.core.publisher.FluxSubscribeOnCallable$CallableSubscribeOnSubscription.run(FluxSubscribeOnCallable.java:227) at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:68) at reactor.core.scheduler.SchedulerTask.call(SchedulerTask.java:28) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:830) ``` we currently have to disable the LDAP health check: `MANAGEMENT_HEALTH_LDAP_ENABLED: "FALSE"` Instead, let's configure the beans manually when they are needed, rather than pulling in the starter: ``` <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-data-ldap</artifactId> </dependency> ```
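A minimal sketch of the exclusion approach, assuming a Spring Boot application class (the `App` class name is illustrative; the `auth.type=LDAP` property is the one this repo already uses). The gold patch below applies the same idea to `KafkaUiApplication` and `LdapSecurityConfig`:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.boot.autoconfigure.ldap.LdapAutoConfiguration;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// Keep the LDAP auto-configuration (and with it the health indicator's
// context source) out of the default application context, so no connection
// to localhost:389 is ever attempted when LDAP auth is not in use.
@SpringBootApplication(exclude = LdapAutoConfiguration.class)
public class App {
    public static void main(String[] args) {
        SpringApplication.run(App.class, args);
    }
}

// Re-import the same auto-configuration only when LDAP auth is actually
// configured, so the LDAP beans exist exactly when they are needed.
@Configuration
@ConditionalOnProperty(value = "auth.type", havingValue = "LDAP")
@Import(LdapAutoConfiguration.class)
class LdapBeansConfig {
}
```

With the auto-configuration excluded by default, the `management.health.ldap.enabled: false` workaround becomes unnecessary, which is exactly the section the patch removes from application.yml.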
[ "kafka-ui-api/pom.xml", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java", "kafka-ui-api/src/main/resources/application.yml" ]
[ "kafka-ui-api/pom.xml", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java", "kafka-ui-api/src/main/resources/application.yml" ]
[]
diff --git a/kafka-ui-api/pom.xml b/kafka-ui-api/pom.xml index 14c2067886c..a43c087b428 100644 --- a/kafka-ui-api/pom.xml +++ b/kafka-ui-api/pom.xml @@ -192,10 +192,6 @@ <version>${antlr4-maven-plugin.version}</version> </dependency> - <dependency> - <groupId>org.springframework.boot</groupId> - <artifactId>spring-boot-starter-data-ldap</artifactId> - </dependency> <dependency> <groupId>org.springframework.security</groupId> <artifactId>spring-security-ldap</artifactId> diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java index ded03514fee..a9a523eb850 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/KafkaUiApplication.java @@ -2,10 +2,11 @@ import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; +import org.springframework.boot.autoconfigure.ldap.LdapAutoConfiguration; import org.springframework.scheduling.annotation.EnableAsync; import org.springframework.scheduling.annotation.EnableScheduling; -@SpringBootApplication +@SpringBootApplication(exclude = LdapAutoConfiguration.class) @EnableScheduling @EnableAsync public class KafkaUiApplication { diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java index 62fdde4bf02..0ba5c231f4b 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/LdapSecurityConfig.java @@ -4,8 +4,10 @@ import lombok.extern.slf4j.Slf4j; import org.springframework.beans.factory.annotation.Value; import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty; +import org.springframework.boot.autoconfigure.ldap.LdapAutoConfiguration; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; +import org.springframework.context.annotation.Import; import org.springframework.ldap.core.support.BaseLdapPathContextSource; import org.springframework.ldap.core.support.LdapContextSource; import org.springframework.security.authentication.AuthenticationManager; @@ -25,6 +27,7 @@ @Configuration @EnableWebFluxSecurity @ConditionalOnProperty(value = "auth.type", havingValue = "LDAP") +@Import(LdapAutoConfiguration.class) @Slf4j public class LdapSecurityConfig extends AbstractAuthSecurityConfig { diff --git a/kafka-ui-api/src/main/resources/application.yml b/kafka-ui-api/src/main/resources/application.yml index a6a4c8e9716..da070a19a8d 100644 --- a/kafka-ui-api/src/main/resources/application.yml +++ b/kafka-ui-api/src/main/resources/application.yml @@ -11,9 +11,6 @@ management: web: exposure: include: "info,health" - health: - ldap: - enabled: false logging: level:
null
test
train
2022-09-19T14:04:37
"2022-07-26T20:43:28Z"
Haarolean
train
provectus/kafka-ui/2435_2606
provectus/kafka-ui
provectus/kafka-ui/2435
provectus/kafka-ui/2606
[ "timestamp(timedelta=1.0, similarity=0.9310576636793927)", "connected" ]
e621a172d5d9a065dd1d6064099bac3aadc32549
93852b260043f805f71bc670744b69c0bd407da6
[]
[ "is this additional line needed? why we can't call this instance inside the lambda?", "why did we add this step?", "the goal of this PR is to make test interact with objects. why did you leave constants like this?", "is that topic or topic name? i've asked you to refactor naming or create objects here", "is this instance really message?? here we also have an discussion", "Refactored", "Deleted this step", "Refactored", "Done ", "discuss and refactored " ]
"2022-09-19T09:39:24Z"
[ "type/enhancement", "scope/QA", "status/accepted" ]
[e2e] create object models and implement them into tests
We need to create a 'models' package in java/ and, for each object used in the framework, create a class with a matching name (for example User.class or Topic.class) using the Lombok library. We also need to add the related options for those objects; a minimal sketch of the pattern is shown below.
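A minimal sketch of such a model, mirroring the Lombok pattern the gold patch below adopts (the field list here is trimmed for illustration; the full model carries more properties):

```java
package com.provectus.kafka.ui.models;

import lombok.Data;
import lombok.experimental.Accessors;

// @Data generates getters, setters, equals/hashCode and toString;
// @Accessors(chain = true) makes every setter return `this`,
// so test fixtures can be built fluently in a single expression.
@Data
@Accessors(chain = true)
public class Topic {
    private String name;
    private String messageKey;
    private String messageContent;
}
```

Tests can then build objects inline, e.g. `new Topic().setName("topic-to-delete").setMessageKey(" ")`, instead of juggling loose string constants.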
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Connector.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Schema.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Topic.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java" ]
[ "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java", "kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java" ]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java index 62d8477fc0b..b094a243a8f 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/helpers/ApiHelper.java @@ -6,7 +6,13 @@ import com.provectus.kafka.ui.api.api.MessagesApi; import com.provectus.kafka.ui.api.api.SchemasApi; import com.provectus.kafka.ui.api.api.TopicsApi; -import com.provectus.kafka.ui.api.model.*; +import com.provectus.kafka.ui.api.model.CreateTopicMessage; +import com.provectus.kafka.ui.api.model.NewConnector; +import com.provectus.kafka.ui.api.model.NewSchemaSubject; +import com.provectus.kafka.ui.api.model.TopicCreation; +import com.provectus.kafka.ui.models.Connector; +import com.provectus.kafka.ui.models.Schema; +import com.provectus.kafka.ui.models.Topic; import lombok.SneakyThrows; import lombok.extern.slf4j.Slf4j; import org.springframework.web.reactive.function.client.WebClientResponseException; @@ -15,6 +21,7 @@ import java.util.Map; import static com.codeborne.selenide.Selenide.sleep; +import static com.provectus.kafka.ui.extensions.FileUtils.fileToString; @Slf4j @@ -67,11 +74,11 @@ public void deleteTopic(String clusterName, String topicName) { } @SneakyThrows - public void createSchema(String clusterName, String schemaName, SchemaType type, String schemaValue) { + public void createSchema(String clusterName, Schema schema) { NewSchemaSubject schemaSubject = new NewSchemaSubject(); - schemaSubject.setSubject(schemaName); - schemaSubject.setSchema(schemaValue); - schemaSubject.setSchemaType(type); + schemaSubject.setSubject(schema.getName()); + schemaSubject.setSchema(fileToString(schema.getValuePath())); + schemaSubject.setSchemaType(schema.getType()); try { schemaApi().createNewSchema(clusterName, schemaSubject).block(); } catch (WebClientResponseException ex) { @@ -96,16 +103,16 @@ public void deleteConnector(String clusterName, String connectName, String conne } @SneakyThrows - public void createConnector(String clusterName, String connectName, String connectorName, String configJson) { - NewConnector connector = new NewConnector(); - connector.setName(connectorName); - Map<String, Object> configMap = new ObjectMapper().readValue(configJson, HashMap.class); - connector.setConfig(configMap); + public void createConnector(String clusterName, String connectName, Connector connector) { + NewConnector connectorProperties = new NewConnector(); + connectorProperties.setName(connector.getName()); + Map<String, Object> configMap = new ObjectMapper().readValue(connector.getConfig(), HashMap.class); + connectorProperties.setConfig(configMap); try { - connectorApi().deleteConnector(clusterName, connectName, connectorName).block(); + connectorApi().deleteConnector(clusterName, connectName, connector.getName()).block(); } catch (WebClientResponseException ignored) { } - connectorApi().createConnector(clusterName, connectName, connector).block(); + connectorApi().createConnector(clusterName, connectName, connectorProperties).block(); } public String getFirstConnectName(String clusterName) { @@ -113,14 +120,13 @@ public String getFirstConnectName(String clusterName) { } @SneakyThrows - public void sendMessage(String clusterName, String topicName, String messageContentJson, - String messageKey) { + public void sendMessage(String clusterName, Topic topic) { CreateTopicMessage createMessage = new 
CreateTopicMessage(); createMessage.partition(0); - createMessage.setContent(messageContentJson); - createMessage.setKey(messageKey); + createMessage.setContent(topic.getMessageContent()); + createMessage.setKey(topic.getMessageKey()); try { - messageApi().sendTopicMessages(clusterName, topicName, createMessage).block(); + messageApi().sendTopicMessages(clusterName, topic.getName(), createMessage).block(); } catch (WebClientResponseException ex) { ex.getRawStatusCode(); } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Connector.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Connector.java new file mode 100644 index 00000000000..9e30ba9f196 --- /dev/null +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Connector.java @@ -0,0 +1,12 @@ +package com.provectus.kafka.ui.models; + +import lombok.Data; +import lombok.experimental.Accessors; + +@Data +@Accessors(chain = true) +public class Connector { + + private String name, config; + +} diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Schema.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Schema.java new file mode 100644 index 00000000000..8097c2e7d72 --- /dev/null +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Schema.java @@ -0,0 +1,33 @@ +package com.provectus.kafka.ui.models; + +import com.provectus.kafka.ui.api.model.SchemaType; +import lombok.Data; +import lombok.experimental.Accessors; + +import static org.apache.commons.lang.RandomStringUtils.randomAlphabetic; + +@Data +@Accessors(chain = true) +public class Schema { + + private String name,valuePath; + private SchemaType type; + + public static Schema createSchemaAvro(){ + return new Schema().setName(randomAlphabetic(10)) + .setType(SchemaType.AVRO) + .setValuePath(System.getProperty("user.dir") + "/src/main/resources/testData/schema_avro_value.json"); + } + + public static Schema createSchemaJson(){ + return new Schema().setName(randomAlphabetic(10)) + .setType(SchemaType.JSON) + .setValuePath(System.getProperty("user.dir") + "/src/main/resources/testData/schema_Json_Value.json"); + } + + public static Schema createSchemaProtobuf(){ + return new Schema().setName(randomAlphabetic(10)) + .setType(SchemaType.PROTOBUF) + .setValuePath(System.getProperty("user.dir") + "/src/main/resources/testData/schema_protobuf_value.txt"); + } +} diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Topic.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Topic.java new file mode 100644 index 00000000000..725db0dd8dd --- /dev/null +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/models/Topic.java @@ -0,0 +1,10 @@ +package com.provectus.kafka.ui.models; + +import lombok.Data; +import lombok.experimental.Accessors; + +@Data +@Accessors(chain = true) +public class Topic { + private String name, compactPolicyValue, timeToRetainData, maxSizeOnDisk, maxMessageBytes, messageKey, messageContent ; +} diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java index 11940326f42..c41153bbdb4 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaCreateView.java @@ -1,6 +1,7 @@ package com.provectus.kafka.ui.pages.schema; import 
com.codeborne.selenide.SelenideElement; +import com.provectus.kafka.ui.api.model.SchemaType; import com.provectus.kafka.ui.utils.BrowserUtils; import io.qameta.allure.Step; import org.openqa.selenium.By; @@ -34,19 +35,4 @@ public SchemaCreateView setSchemaField(String text) { schemaField.setValue(text); return this; } - - public enum SchemaType { - AVRO("AVRO"), - JSON("JSON"), - PROTOBUF("PROTOBUF"); - - final String value; - - SchemaType(String value) { - this.value = value; - } - public String getValue(){ - return value; - } - } } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java index 716e7ef80c1..9e853908e5f 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/schema/SchemaEditView.java @@ -4,6 +4,7 @@ import com.codeborne.selenide.Selenide; import com.codeborne.selenide.SelenideElement; import com.provectus.kafka.ui.api.model.CompatibilityLevel; +import com.provectus.kafka.ui.api.model.SchemaType; import com.provectus.kafka.ui.utils.BrowserUtils; import io.qameta.allure.Step; import org.openqa.selenium.By; @@ -18,7 +19,7 @@ public class SchemaEditView { protected SelenideElement schemaTypeDropDown = $x("//ul[@name='schemaType']"); @Step - public SchemaEditView selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType schemaType) { + public SchemaEditView selectSchemaTypeFromDropdown(SchemaType schemaType) { $x("//ul[@name='schemaType']").click(); $x("//li[text()='" + schemaType.getValue() + "']").click(); return this;
diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java index 5f11151c8b0..19682d4be0c 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/ConnectorsTests.java @@ -1,8 +1,9 @@ package com.provectus.kafka.ui.tests; import com.provectus.kafka.ui.base.BaseTest; -import com.provectus.kafka.ui.helpers.ApiHelper; import com.provectus.kafka.ui.helpers.Helpers; +import com.provectus.kafka.ui.models.Connector; +import com.provectus.kafka.ui.models.Topic; import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; @@ -12,98 +13,102 @@ import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; +import java.util.ArrayList; +import java.util.List; + import static com.provectus.kafka.ui.extensions.FileUtils.getResourceAsString; public class ConnectorsTests extends BaseTest { - - private final long suiteId = 10; - private final String suiteTitle = "Kafka Connect"; - public static final String SINK_CONNECTOR = "sink_postgres_activities_e2e_checks"; - public static final String TOPIC_FOR_CONNECTOR = "topic_for_connector"; - public static final String TOPIC_FOR_DELETE_CONNECTOR = "topic_for_delete_connector"; - public static final String TOPIC_FOR_UPDATE_CONNECTOR = "topic_for_update_connector"; - public static final String FIRST_CONNECTOR = "first"; - public static final String CONNECTOR_FOR_DELETE = "sink_postgres_activities_e2e_checks_for_delete"; - public static final String CONNECTOR_FOR_UPDATE = "sink_postgres_activities_e2e_checks_for_update"; + private static final long SUITE_ID = 10; + private static final String SUITE_TITLE = "Kafka Connect"; + private static final String CONNECT_NAME = "first"; + private static final List<Topic> TOPIC_LIST = new ArrayList<>(); + private static final List<Connector> CONNECTOR_LIST = new ArrayList<>(); + private static final String MESSAGE_CONTENT = "message_content_create_topic.json"; + private static final String MESSAGE_KEY = " "; + private static final Topic TOPIC_FOR_CREATE = new Topic() + .setName("topic_for_create_connector") + .setMessageContent(MESSAGE_CONTENT).setMessageKey(MESSAGE_KEY); + private static final Topic TOPIC_FOR_DELETE = new Topic() + .setName("topic_for_delete_connector") + .setMessageContent(MESSAGE_CONTENT).setMessageKey(MESSAGE_KEY); + private static final Topic TOPIC_FOR_UPDATE = new Topic() + .setName("topic_for_update_connector") + .setMessageContent(MESSAGE_CONTENT).setMessageKey(MESSAGE_KEY); + private static final Connector CONNECTOR_FOR_DELETE = new Connector() + .setName("sink_postgres_activities_e2e_checks_for_delete") + .setConfig(getResourceAsString("delete_connector_config.json")); + private static final Connector CONNECTOR_FOR_UPDATE = new Connector() + .setName("sink_postgres_activities_e2e_checks_for_update") + .setConfig(getResourceAsString("config_for_create_connector_via_api.json")); @BeforeAll public static void beforeAll() { - ApiHelper apiHelper = Helpers.INSTANCE.apiHelper; - - String connectorToDelete = getResourceAsString("delete_connector_config.json"); - String connectorToUpdate = getResourceAsString("config_for_create_connector_via_api.json"); - String message = getResourceAsString("message_content_create_topic.json"); - - 
apiHelper.deleteTopic(CLUSTER_NAME, CONNECTOR_FOR_DELETE); - - apiHelper.createTopic(CLUSTER_NAME, TOPIC_FOR_CONNECTOR); - apiHelper.sendMessage(CLUSTER_NAME, TOPIC_FOR_CONNECTOR, message, " "); - - apiHelper.createTopic(CLUSTER_NAME, TOPIC_FOR_DELETE_CONNECTOR); - apiHelper.sendMessage(CLUSTER_NAME, TOPIC_FOR_DELETE_CONNECTOR, message, " "); - - apiHelper.createTopic(CLUSTER_NAME, TOPIC_FOR_UPDATE_CONNECTOR); - apiHelper.sendMessage(CLUSTER_NAME, TOPIC_FOR_UPDATE_CONNECTOR, message, " "); - - apiHelper.createConnector(CLUSTER_NAME, FIRST_CONNECTOR, CONNECTOR_FOR_DELETE, connectorToDelete); - apiHelper.createConnector(CLUSTER_NAME, FIRST_CONNECTOR, CONNECTOR_FOR_UPDATE, connectorToUpdate); - } - - @AfterAll - public static void afterAll() { - ApiHelper apiHelper = Helpers.INSTANCE.apiHelper; - apiHelper.deleteConnector(CLUSTER_NAME, FIRST_CONNECTOR, SINK_CONNECTOR); - apiHelper.deleteConnector(CLUSTER_NAME, FIRST_CONNECTOR, CONNECTOR_FOR_UPDATE); - apiHelper.deleteTopic(CLUSTER_NAME, TOPIC_FOR_CONNECTOR); - apiHelper.deleteTopic(CLUSTER_NAME, TOPIC_FOR_DELETE_CONNECTOR); - apiHelper.deleteTopic(CLUSTER_NAME, TOPIC_FOR_UPDATE_CONNECTOR); + TOPIC_LIST.addAll(List.of(TOPIC_FOR_CREATE, TOPIC_FOR_DELETE, TOPIC_FOR_UPDATE)); + TOPIC_LIST.forEach(topic -> { + Helpers.INSTANCE.apiHelper.createTopic(CLUSTER_NAME, topic.getName()); + Helpers.INSTANCE.apiHelper.sendMessage(CLUSTER_NAME, topic); + }); + CONNECTOR_LIST.addAll(List.of(CONNECTOR_FOR_DELETE, CONNECTOR_FOR_UPDATE)); + CONNECTOR_LIST.forEach(connector -> Helpers.INSTANCE.apiHelper + .createConnector(CLUSTER_NAME, CONNECT_NAME, connector)); } @DisplayName("should create a connector") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(42) @Test public void createConnector() { + Connector connectorForCreate = new Connector() + .setName("sink_postgres_activities_e2e_checks") + .setConfig(getResourceAsString("config_for_create_connector.json")); pages.openConnectorsList(CLUSTER_NAME) .waitUntilScreenReady() .clickCreateConnectorButton() .waitUntilScreenReady() - .setConnectorConfig( - SINK_CONNECTOR, - getResourceAsString("config_for_create_connector.json")); + .setConnectorConfig(connectorForCreate.getName(), connectorForCreate.getConfig()); pages.openConnectorsList(CLUSTER_NAME) .waitUntilScreenReady() - .connectorIsVisibleInList(SINK_CONNECTOR, TOPIC_FOR_CONNECTOR); + .connectorIsVisibleInList(connectorForCreate.getName(), TOPIC_FOR_CREATE.getName()); + CONNECTOR_LIST.add(connectorForCreate); } @DisplayName("should update a connector") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(196) @Test public void updateConnector() { pages.openConnectorsList(CLUSTER_NAME) .waitUntilScreenReady() - .openConnector(CONNECTOR_FOR_UPDATE); + .openConnector(CONNECTOR_FOR_UPDATE.getName()); pages.connectorsView.connectorIsVisibleOnOverview(); pages.connectorsView.openEditConfig() - .updConnectorConfig(getResourceAsString("config_for_update_connector.json")); + .updConnectorConfig(CONNECTOR_FOR_UPDATE.getConfig()); pages.openConnectorsList(CLUSTER_NAME) - .connectorIsVisibleInList(CONNECTOR_FOR_UPDATE, TOPIC_FOR_UPDATE_CONNECTOR); + .connectorIsVisibleInList(CONNECTOR_FOR_UPDATE.getName(), TOPIC_FOR_UPDATE.getName()); } @DisplayName("should delete connector") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) 
@AutomationStatus(status = Status.AUTOMATED) @CaseId(195) @Test public void deleteConnector() { pages.openConnectorsList(CLUSTER_NAME) .waitUntilScreenReady() - .openConnector(CONNECTOR_FOR_DELETE); + .openConnector(CONNECTOR_FOR_DELETE.getName()); pages.connectorsView.clickDeleteButton(); pages.openConnectorsList(CLUSTER_NAME) - .isNotVisible(CONNECTOR_FOR_DELETE); + .isNotVisible(CONNECTOR_FOR_DELETE.getName()); + CONNECTOR_LIST.remove(CONNECTOR_FOR_DELETE); + } + + @AfterAll + public static void afterAll() { + CONNECTOR_LIST.forEach(connector -> + Helpers.INSTANCE.apiHelper.deleteConnector(CLUSTER_NAME, CONNECT_NAME, connector.getName())); + TOPIC_LIST.forEach(topic -> Helpers.INSTANCE.apiHelper.deleteTopic(CLUSTER_NAME, topic.getName())); } } diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java index 22c8db453d9..9218548d9de 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/SchemasTests.java @@ -1,99 +1,85 @@ package com.provectus.kafka.ui.tests; import com.provectus.kafka.ui.api.model.CompatibilityLevel; -import com.provectus.kafka.ui.api.model.SchemaType; import com.provectus.kafka.ui.base.BaseTest; import com.provectus.kafka.ui.helpers.Helpers; +import com.provectus.kafka.ui.models.Schema; import com.provectus.kafka.ui.pages.MainPage; -import com.provectus.kafka.ui.pages.schema.SchemaCreateView; import com.provectus.kafka.ui.pages.schema.SchemaEditView; import com.provectus.kafka.ui.utils.qaseIO.Status; import com.provectus.kafka.ui.utils.qaseIO.annotation.AutomationStatus; import com.provectus.kafka.ui.utils.qaseIO.annotation.Suite; import io.qase.api.annotation.CaseId; +import lombok.SneakyThrows; import org.junit.jupiter.api.*; +import java.util.ArrayList; +import java.util.List; + import static com.provectus.kafka.ui.extensions.FileUtils.fileToString; + @TestMethodOrder(MethodOrderer.OrderAnnotation.class) public class SchemasTests extends BaseTest { - - private final long suiteId = 11; - private final String suiteTitle = "Schema Registry"; - public static final String SCHEMA_AVRO_CREATE = "avro_schema"; - public static final String SCHEMA_JSON_CREATE = "json_schema"; - public static final String SCHEMA_PROTOBUF_CREATE = "protobuf_schema"; - public static final String SCHEMA_AVRO_API_UPDATE = "avro_schema_for_update_api"; - public static final String SCHEMA_AVRO_API = "avro_schema_api"; - public static final String SCHEMA_JSON_API = "json_schema_api"; - public static final String SCHEMA_PROTOBUF_API = "protobuf_schema_api"; - private static final String PATH_AVRO_VALUE = System.getProperty("user.dir") + "/src/test/resources/schema_avro_value.json"; - private static final String PATH_AVRO_FOR_UPDATE = System.getProperty("user.dir") + "/src/test/resources/schema_avro_for_update.json"; - private static final String PATH_PROTOBUF_VALUE = System.getProperty("user.dir") + "/src/test/resources/schema_protobuf_value.txt"; - private static final String PATH_JSON_VALUE = System.getProperty("user.dir") + "/src/test/resources/schema_Json_Value.json"; + private static final long SUITE_ID = 11; + private static final String SUITE_TITLE = "Schema Registry"; + private static final List<Schema> SCHEMA_LIST = new ArrayList<>(); + private static final Schema AVRO_API = Schema.createSchemaAvro(); + private static final Schema JSON_API = Schema.createSchemaJson(); + 
private static final Schema PROTOBUF_API = Schema.createSchemaProtobuf(); @BeforeAll + @SneakyThrows public static void beforeAll() { - Helpers.INSTANCE.apiHelper.createSchema(CLUSTER_NAME, SCHEMA_AVRO_API_UPDATE, SchemaType.AVRO, fileToString(PATH_AVRO_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(CLUSTER_NAME, SCHEMA_AVRO_API, SchemaType.AVRO, fileToString(PATH_AVRO_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(CLUSTER_NAME, SCHEMA_JSON_API, SchemaType.JSON, fileToString(PATH_JSON_VALUE)); - Helpers.INSTANCE.apiHelper.createSchema(CLUSTER_NAME, SCHEMA_PROTOBUF_API, SchemaType.PROTOBUF, fileToString(PATH_PROTOBUF_VALUE)); - } - - @AfterAll - public static void afterAll() { - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_AVRO_CREATE); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_JSON_CREATE); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_PROTOBUF_CREATE); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_AVRO_API_UPDATE); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_AVRO_API); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_JSON_API); - Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, SCHEMA_PROTOBUF_API); - + SCHEMA_LIST.addAll(List.of(AVRO_API, JSON_API, PROTOBUF_API)); + SCHEMA_LIST.forEach(schema -> Helpers.INSTANCE.apiHelper.createSchema(CLUSTER_NAME, schema)); } @DisplayName("should create AVRO schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(43) @Test @Order(1) void createSchemaAvro() { + Schema schemaAvro = Schema.createSchemaAvro(); pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() - .setSubjectName(SCHEMA_AVRO_CREATE) - .setSchemaField(fileToString(PATH_AVRO_VALUE)) - .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.AVRO) + .setSubjectName(schemaAvro.getName()) + .setSchemaField(fileToString(schemaAvro.getValuePath())) + .selectSchemaTypeFromDropdown(schemaAvro.getType()) .clickSubmit() .waitUntilScreenReady(); pages.mainPage .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.isSchemaVisible(SCHEMA_AVRO_CREATE); + pages.schemaRegistry.isSchemaVisible(schemaAvro.getName()); + SCHEMA_LIST.add(schemaAvro); } @DisplayName("should update AVRO schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(186) @Test @Order(2) void updateSchemaAvro() { + AVRO_API.setValuePath(System.getProperty("user.dir") + "/src/main/resources/testData/schema_avro_for_update.json"); pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.openSchema(SCHEMA_AVRO_API_UPDATE) + pages.schemaRegistry.openSchema(AVRO_API.getName()) .waitUntilScreenReady() .openEditSchema(); Assertions.assertTrue(new SchemaEditView().isSchemaDropDownDisabled(),"isSchemaDropDownDisabled()"); new SchemaEditView().selectCompatibilityLevelFromDropdown(CompatibilityLevel.CompatibilityEnum.NONE) - .setNewSchemaValue(fileToString(PATH_AVRO_FOR_UPDATE)) + .setNewSchemaValue(fileToString(AVRO_API.getValuePath())) .clickSubmit() .waitUntilScreenReady() .isCompatibility(CompatibilityLevel.CompatibilityEnum.NONE); } @DisplayName("should delete AVRO schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, 
title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(187) @Test @@ -101,34 +87,37 @@ void updateSchemaAvro() { void deleteSchemaAvro() { pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.openSchema(SCHEMA_AVRO_API) + pages.schemaRegistry.openSchema(AVRO_API.getName()) .waitUntilScreenReady() .removeSchema() - .isNotVisible(SCHEMA_AVRO_API); + .isNotVisible(AVRO_API.getName()); + SCHEMA_LIST.remove(AVRO_API); } @DisplayName("should create JSON schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(89) @Test @Order(4) void createSchemaJson() { + Schema schemaJson = Schema.createSchemaJson(); pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() - .setSubjectName(SCHEMA_JSON_CREATE) - .setSchemaField(fileToString(PATH_JSON_VALUE)) - .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.JSON) + .setSubjectName(schemaJson.getName()) + .setSchemaField(fileToString(schemaJson.getValuePath())) + .selectSchemaTypeFromDropdown(schemaJson.getType()) .clickSubmit() .waitUntilScreenReady(); pages.mainPage .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.isSchemaVisible(SCHEMA_JSON_CREATE); + pages.schemaRegistry.isSchemaVisible(schemaJson.getName()); + SCHEMA_LIST.add(schemaJson); } @DisplayName("should delete JSON schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(189) @Test @@ -136,34 +125,37 @@ void createSchemaJson() { void deleteSchemaJson() { pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.openSchema(SCHEMA_JSON_API) + pages.schemaRegistry.openSchema(JSON_API.getName()) .waitUntilScreenReady() .removeSchema() - .isNotVisible(SCHEMA_JSON_API); + .isNotVisible(JSON_API.getName()); + SCHEMA_LIST.remove(JSON_API); } @DisplayName("should create PROTOBUF schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(91) @Test @Order(6) void createSchemaProtobuf() { + Schema schemaProtobuf = Schema.createSchemaProtobuf(); pages.openMainPage() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); pages.schemaRegistry.clickCreateSchema() - .setSubjectName(SCHEMA_PROTOBUF_CREATE) - .setSchemaField(fileToString(PATH_PROTOBUF_VALUE)) - .selectSchemaTypeFromDropdown(SchemaCreateView.SchemaType.PROTOBUF) + .setSubjectName(schemaProtobuf.getName()) + .setSchemaField(fileToString(schemaProtobuf.getValuePath())) + .selectSchemaTypeFromDropdown(schemaProtobuf.getType()) .clickSubmit() .waitUntilScreenReady(); pages.mainPage .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.isSchemaVisible(SCHEMA_PROTOBUF_CREATE); + pages.schemaRegistry.isSchemaVisible(schemaProtobuf.getName()); + SCHEMA_LIST.add(schemaProtobuf); } @DisplayName("should delete PROTOBUF schema") - @Suite(suiteId = suiteId, title = suiteTitle) + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(223) @Test @@ -171,9 +163,15 @@ void createSchemaProtobuf() { void deleteSchemaProtobuf() { pages.openMainPage() .goToSideMenu(CLUSTER_NAME, 
MainPage.SideMenuOptions.SCHEMA_REGISTRY); - pages.schemaRegistry.openSchema(SCHEMA_PROTOBUF_API) + pages.schemaRegistry.openSchema(PROTOBUF_API.getName()) .waitUntilScreenReady() .removeSchema() - .isNotVisible(SCHEMA_PROTOBUF_API); + .isNotVisible(PROTOBUF_API.getName()); + SCHEMA_LIST.remove(PROTOBUF_API); + } + + @AfterAll + public static void afterAll() { + SCHEMA_LIST.forEach(schema -> Helpers.INSTANCE.apiHelper.deleteSchema(CLUSTER_NAME, schema.getName())); } } diff --git a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java index 37d3f9b01d0..8f74109631d 100644 --- a/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java +++ b/kafka-ui-e2e-checks/src/test/java/com/provectus/kafka/ui/tests/TopicTests.java @@ -2,6 +2,7 @@ import com.provectus.kafka.ui.base.BaseTest; import com.provectus.kafka.ui.helpers.Helpers; +import com.provectus.kafka.ui.models.Topic; import com.provectus.kafka.ui.pages.MainPage; import com.provectus.kafka.ui.pages.topic.TopicView; import com.provectus.kafka.ui.utils.qaseIO.Status; @@ -11,32 +12,29 @@ import io.qase.api.annotation.CaseId; import org.junit.jupiter.api.*; +import java.util.ArrayList; +import java.util.List; + import static com.provectus.kafka.ui.extensions.FileUtils.fileToString; public class TopicTests extends BaseTest { - - public static final String NEW_TOPIC = "new-topic"; - public static final String TOPIC_TO_UPDATE = "topic-to-update"; - public static final String TOPIC_TO_DELETE = "topic-to-delete"; - public static final String COMPACT_POLICY_VALUE = "Compact"; - public static final String UPDATED_TIME_TO_RETAIN_VALUE = "604800001"; - public static final String UPDATED_MAX_SIZE_ON_DISK = "20 GB"; - public static final String UPDATED_MAX_MESSAGE_BYTES = "1000020"; - private static final String KEY_TO_PRODUCE_MESSAGE = System.getProperty("user.dir") + "/src/test/resources/producedkey.txt"; - private static final String CONTENT_TO_PRODUCE_MESSAGE = System.getProperty("user.dir") + "/src/test/resources/testData.txt"; - + private static final long SUITE_ID = 2; + private static final String SUITE_TITLE = "Topics"; + private static final Topic TOPIC_FOR_UPDATE = new Topic() + .setName("topic-to-update") + .setCompactPolicyValue("Compact") + .setTimeToRetainData("604800001") + .setMaxSizeOnDisk("20 GB") + .setMaxMessageBytes("1000020") + .setMessageKey(fileToString(System.getProperty("user.dir") + "/src/test/resources/producedkey.txt")) + .setMessageContent(fileToString(System.getProperty("user.dir") + "/src/test/resources/testData.txt")); + private static final Topic TOPIC_FOR_DELETE = new Topic().setName("topic-to-delete"); + private static final List<Topic> TOPIC_LIST = new ArrayList<>(); @BeforeAll public static void beforeAll() { - Helpers.INSTANCE.apiHelper.createTopic(CLUSTER_NAME, TOPIC_TO_UPDATE); - Helpers.INSTANCE.apiHelper.createTopic(CLUSTER_NAME, TOPIC_TO_DELETE); - } - - @AfterAll - public static void afterAll() { - Helpers.INSTANCE.apiHelper.deleteTopic(CLUSTER_NAME, TOPIC_TO_UPDATE); - Helpers.INSTANCE.apiHelper.deleteTopic(CLUSTER_NAME, TOPIC_TO_DELETE); - Helpers.INSTANCE.apiHelper.deleteTopic(CLUSTER_NAME, NEW_TOPIC); + TOPIC_LIST.addAll(List.of(TOPIC_FOR_UPDATE, TOPIC_FOR_DELETE)); + TOPIC_LIST.forEach(topic -> Helpers.INSTANCE.apiHelper.createTopic(CLUSTER_NAME, topic.getName())); } @DisplayName("should create a topic") @@ -45,84 +43,88 @@ public static void afterAll() { @CaseId(199) @Test public 
void createTopic() { + Topic topicToCreate = new Topic().setName("new-topic"); pages.open() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.TOPICS); pages.topicsList.pressCreateNewTopic() - .setTopicName(NEW_TOPIC) + .setTopicName(topicToCreate.getName()) .sendData() .waitUntilScreenReady(); pages.open() .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.TOPICS) - .topicIsVisible(NEW_TOPIC); - helpers.apiHelper.deleteTopic(CLUSTER_NAME, NEW_TOPIC); - pages.open() - .goToSideMenu(CLUSTER_NAME, MainPage.SideMenuOptions.TOPICS) - .topicIsNotVisible(NEW_TOPIC); + .topicIsVisible(topicToCreate.getName()); + TOPIC_LIST.add(topicToCreate); } + @Disabled("Due to issue https://github.com/provectus/kafka-ui/issues/1500 ignore this test") @DisplayName("should update a topic") @Issue("1500") - @Suite(suiteId = 2, title = "Topics") + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(197) @Test public void updateTopic() { pages.openTopicsList(CLUSTER_NAME) .waitUntilScreenReady(); - pages.openTopicView(CLUSTER_NAME, TOPIC_TO_UPDATE) + pages.openTopicView(CLUSTER_NAME, TOPIC_FOR_UPDATE.getName()) .waitUntilScreenReady() .openEditSettings() - .selectCleanupPolicy(COMPACT_POLICY_VALUE) + .selectCleanupPolicy(TOPIC_FOR_UPDATE.getCompactPolicyValue()) .setMinInsyncReplicas(10) - .setTimeToRetainDataInMs(UPDATED_TIME_TO_RETAIN_VALUE) - .setMaxSizeOnDiskInGB(UPDATED_MAX_SIZE_ON_DISK) - .setMaxMessageBytes(UPDATED_MAX_MESSAGE_BYTES) + .setTimeToRetainDataInMs(TOPIC_FOR_UPDATE.getTimeToRetainData()) + .setMaxSizeOnDiskInGB(TOPIC_FOR_UPDATE.getMaxSizeOnDisk()) + .setMaxMessageBytes(TOPIC_FOR_UPDATE.getMaxMessageBytes()) .sendData() .waitUntilScreenReady(); - pages.openTopicsList(CLUSTER_NAME) .waitUntilScreenReady(); - pages.openTopicView(CLUSTER_NAME, TOPIC_TO_UPDATE) + pages.openTopicView(CLUSTER_NAME, TOPIC_FOR_UPDATE.getName()) .openEditSettings() // Assertions - .cleanupPolicyIs(COMPACT_POLICY_VALUE) - .timeToRetainIs(UPDATED_TIME_TO_RETAIN_VALUE) - .maxSizeOnDiskIs(UPDATED_MAX_SIZE_ON_DISK) - .maxMessageBytesIs(UPDATED_MAX_MESSAGE_BYTES); + .cleanupPolicyIs(TOPIC_FOR_UPDATE.getCompactPolicyValue()) + .timeToRetainIs(TOPIC_FOR_UPDATE.getTimeToRetainData()) + .maxSizeOnDiskIs(TOPIC_FOR_UPDATE.getMaxSizeOnDisk()) + .maxMessageBytesIs(TOPIC_FOR_UPDATE.getMaxMessageBytes()); } @DisplayName("should delete topic") - @Suite(suiteId = 2, title = "Topics") + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(207) @Test public void deleteTopic() { pages.openTopicsList(CLUSTER_NAME) .waitUntilScreenReady() - .openTopic(TOPIC_TO_DELETE) + .openTopic(TOPIC_FOR_DELETE.getName()) .waitUntilScreenReady() .deleteTopic(); pages.openTopicsList(CLUSTER_NAME) .waitUntilScreenReady() - .isTopicNotVisible(TOPIC_TO_DELETE); + .isTopicNotVisible(TOPIC_FOR_DELETE.getName()); + TOPIC_LIST.remove(TOPIC_FOR_DELETE); } @DisplayName("produce message") - @Suite(suiteId = 2, title = "Topics") + @Suite(suiteId = SUITE_ID, title = SUITE_TITLE) @AutomationStatus(status = Status.AUTOMATED) @CaseId(222) @Test void produceMessage() { pages.openTopicsList(CLUSTER_NAME) .waitUntilScreenReady() - .openTopic(TOPIC_TO_UPDATE) + .openTopic(TOPIC_FOR_UPDATE.getName()) .waitUntilScreenReady() .openTopicMenu(TopicView.TopicMenu.MESSAGES) .clickOnButton("Produce Message") - .setContentFiled(fileToString(CONTENT_TO_PRODUCE_MESSAGE)) - .setKeyField(fileToString(KEY_TO_PRODUCE_MESSAGE)) + .setContentFiled(TOPIC_FOR_UPDATE.getMessageContent()) + 
.setKeyField(TOPIC_FOR_UPDATE.getMessageKey()) .submitProduceMessage(); - Assertions.assertTrue(pages.topicView.isKeyMessageVisible(fileToString(KEY_TO_PRODUCE_MESSAGE))); - Assertions.assertTrue(pages.topicView.isContentMessageVisible(fileToString(CONTENT_TO_PRODUCE_MESSAGE).trim())); + Assertions.assertTrue(pages.topicView.isKeyMessageVisible(TOPIC_FOR_UPDATE.getMessageKey())); + Assertions.assertTrue(pages.topicView.isContentMessageVisible(TOPIC_FOR_UPDATE.getMessageContent().trim())); + } + + @AfterAll + public static void afterAll() { + TOPIC_LIST.forEach(topic -> Helpers.INSTANCE.apiHelper.deleteTopic(CLUSTER_NAME, topic.getName())); } }
val
train
2022-09-20T10:24:04
"2022-08-12T06:47:08Z"
VladSenyuta
train
provectus/kafka-ui/2328_2608
provectus/kafka-ui
provectus/kafka-ui/2328
provectus/kafka-ui/2608
[ "timestamp(timedelta=0.0, similarity=0.9427111022466336)", "keyword_pr_to_issue" ]
9962d29926436ffa10d18db02edb839ee28e3951
e621a172d5d9a065dd1d6064099bac3aadc32549
[]
[]
"2022-09-19T13:35:47Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
SR: Display additional fields in the table and overview
SR table: display schema type and id as separate columns. SR subject overview: display the type.
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx", "kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx", "kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx", "kafka-ui-react-app/src/components/Schemas/List/List.tsx", "kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/Details/Details.tsx", "kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx", "kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx", "kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx", "kafka-ui-react-app/src/components/Schemas/List/List.tsx", "kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx index 847079f3da7..fb27336493a 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/Details.tsx @@ -124,6 +124,7 @@ const Details: React.FC = () => { <TableHeaderCell /> <TableHeaderCell title="Version" /> <TableHeaderCell title="ID" /> + <TableHeaderCell title="Type" /> </tr> </thead> <tbody> diff --git a/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx b/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx index 5cc472d9f0f..21d443a68c8 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/LatestVersion/LatestVersionItem.tsx @@ -26,6 +26,10 @@ const LatestVersionItem: React.FC<LatestVersionProps> = ({ <S.MetaDataLabel>ID</S.MetaDataLabel> <p>{id}</p> </div> + <div> + <S.MetaDataLabel>Type</S.MetaDataLabel> + <p>{schemaType}</p> + </div> <div> <S.MetaDataLabel>Subject</S.MetaDataLabel> <p>{subject}</p> diff --git a/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx b/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx index 875627d452c..1a729a6eaec 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx @@ -25,7 +25,8 @@ const SchemaVersion: React.FC<SchemaVersionProps> = ({ </IconButtonWrapper> </td> <td style={{ width: '6%' }}>{version}</td> - <td>{id}</td> + <td style={{ width: '6%' }}>{id}</td> + <td>{schemaType}</td> </tr> {isOpen && ( <S.Wrapper> diff --git a/kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx b/kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx index 617c8cc4259..ccd50cd957f 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/__test__/SchemaVersion.spec.tsx @@ -17,7 +17,7 @@ const component = ( describe('SchemaVersion', () => { it('renders versions', () => { render(component); - expect(screen.getAllByRole('cell')).toHaveLength(3); + expect(screen.getAllByRole('cell')).toHaveLength(4); expect(screen.queryByTestId('json-viewer')).not.toBeInTheDocument(); userEvent.click(screen.getByRole('button')); }); diff --git a/kafka-ui-react-app/src/components/Schemas/List/List.tsx b/kafka-ui-react-app/src/components/Schemas/List/List.tsx index 8a1344f55e6..1baae27347e 100644 --- a/kafka-ui-react-app/src/components/Schemas/List/List.tsx +++ b/kafka-ui-react-app/src/components/Schemas/List/List.tsx @@ -57,6 +57,8 @@ const List: React.FC = () => { const columns = React.useMemo<ColumnDef<SchemaSubject>[]>( () => [ { header: 'Subject', accessorKey: 'subject', cell: LinkCell }, + { header: 'Id', accessorKey: 'id' }, + { header: 'Type', accessorKey: 'schemaType' }, { header: 'Version', accessorKey: 'version' }, { header: 'Compatibility', accessorKey: 'compatibilityLevel' }, ], diff --git a/kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx b/kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx index cefb32819ac..a6636e5d876 100644 --- a/kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx +++ 
b/kafka-ui-react-app/src/components/Schemas/List/__test__/List.spec.tsx @@ -110,9 +110,10 @@ describe('List', () => { expect(screen.getByText(schemaVersion2.subject)).toBeInTheDocument(); }); it('handles onRowClick', () => { - const { subject, version, compatibilityLevel } = schemaVersion2; + const { id, schemaType, subject, version, compatibilityLevel } = + schemaVersion2; const row = screen.getByRole('row', { - name: `${subject} ${version} ${compatibilityLevel}`, + name: `${subject} ${id} ${schemaType} ${version} ${compatibilityLevel}`, }); expect(row).toBeInTheDocument(); userEvent.click(row);
null
test
train
2022-09-20T01:37:21
"2022-07-22T22:27:21Z"
Haarolean
train
provectus/kafka-ui/2532_2618
provectus/kafka-ui
provectus/kafka-ui/2532
provectus/kafka-ui/2618
[ "keyword_pr_to_issue", "connected", "timestamp(timedelta=2.0, similarity=1.0000000000000002)" ]
eb062359a9eca8665833d610100525c8832a3a01
3733729a55123781d7e5ca5ea94be582602be93a
[]
[]
"2022-09-20T13:49:02Z"
[ "good first issue", "scope/frontend", "status/accepted", "type/chore" ]
Create schema view is too wide
**Describe the bug** The Create schema view is too wide; please make the fields smaller. **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to Schema Registry 2. Press "Create Schema" **Expected behavior** The form fields should be narrower instead of stretching across the full view. **Screenshots** <img width="1717" alt="create schema" src="https://user-images.githubusercontent.com/104780608/188563301-dc0106fd-b843-4c74-a948-4c2e1cfc1f39.png">
[ "kafka-ui-react-app/src/components/Schemas/New/New.styled.ts", "kafka-ui-react-app/src/components/Schemas/New/New.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/New/New.styled.ts", "kafka-ui-react-app/src/components/Schemas/New/New.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/New/New.styled.ts b/kafka-ui-react-app/src/components/Schemas/New/New.styled.ts index e7cf70ea276..257c18df1b0 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/New.styled.ts +++ b/kafka-ui-react-app/src/components/Schemas/New/New.styled.ts @@ -6,6 +6,7 @@ export const Form = styled.form` display: flex; flex-direction: column; gap: 16px; + width: 50%; & > button { align-self: flex-start; diff --git a/kafka-ui-react-app/src/components/Schemas/New/New.tsx b/kafka-ui-react-app/src/components/Schemas/New/New.tsx index 21afda96e0d..39fa49e7416 100644 --- a/kafka-ui-react-app/src/components/Schemas/New/New.tsx +++ b/kafka-ui-react-app/src/components/Schemas/New/New.tsx @@ -120,7 +120,7 @@ const New: React.FC = () => { name={name} value={value} onChange={onChange} - minWidth="50%" + minWidth="100%" disabled={isSubmitting} options={SchemaTypeOptions} />
null
val
train
2022-09-26T15:02:48
"2022-09-06T06:34:45Z"
armenuikafka
train
provectus/kafka-ui/2080_2628
provectus/kafka-ui
provectus/kafka-ui/2080
provectus/kafka-ui/2628
[ "timestamp(timedelta=1.0, similarity=0.8582190628575523)", "connected" ]
3f4791ff0a70418836a7de5f263674d128bd43a5
2f786c080b78b39740a68297879775c4fc31e295
[ "Checked this issue - on \"latest\" image its works fine, so we need to review changes applied to master. " ]
[ "let's create an issue for this?" ]
"2022-09-21T18:04:21Z"
[ "type/bug", "scope/backend", "status/accepted" ]
Linked Consumer is not displayed within Topic/Consumers tab
**Describe the bug** The linked consumer is not displayed within the Topic profile's Consumers tab **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** Steps to reproduce the behavior: 1. Navigate to Consumers 2. Open the consumer's profile 3. Click on a linked topic 4. Switch to the Consumers tab **Expected behavior** The linked consumer should be displayed within Topic/Consumers **Screenshots** https://user-images.githubusercontent.com/104780608/171190816-083adac5-a1b7-445e-97ee-0a2905c83709.mov
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[ "kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalTopicConsumerGroup.java", "kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java", "kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml" ]
[]
diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java index d04484b3817..75c0a99039b 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/mapper/ConsumerGroupMapper.java @@ -6,6 +6,7 @@ import com.provectus.kafka.ui.model.ConsumerGroupStateDTO; import com.provectus.kafka.ui.model.ConsumerGroupTopicPartitionDTO; import com.provectus.kafka.ui.model.InternalConsumerGroup; +import com.provectus.kafka.ui.model.InternalTopicConsumerGroup; import java.util.ArrayList; import java.util.HashMap; import java.util.Map; @@ -24,6 +25,20 @@ public static ConsumerGroupDTO toDto(InternalConsumerGroup c) { return convertToConsumerGroup(c, new ConsumerGroupDTO()); } + public static ConsumerGroupDTO toDto(InternalTopicConsumerGroup c) { + ConsumerGroupDTO consumerGroup = new ConsumerGroupDetailsDTO(); + consumerGroup.setTopics(1); //for ui backward-compatibility, need to rm usage from ui + consumerGroup.setGroupId(c.getGroupId()); + consumerGroup.setMembers(c.getMembers()); + consumerGroup.setMessagesBehind(c.getMessagesBehind()); + consumerGroup.setSimple(c.isSimple()); + consumerGroup.setPartitionAssignor(c.getPartitionAssignor()); + consumerGroup.setState(mapConsumerGroupState(c.getState())); + Optional.ofNullable(c.getCoordinator()) + .ifPresent(cd -> consumerGroup.setCoordinator(mapCoordinator(cd))); + return consumerGroup; + } + public static ConsumerGroupDetailsDTO toDetailsDto(InternalConsumerGroup g) { ConsumerGroupDetailsDTO details = convertToConsumerGroup(g, new ConsumerGroupDetailsDTO()); Map<TopicPartition, ConsumerGroupTopicPartitionDTO> partitionMap = new HashMap<>(); diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java index ab5b6eed388..d7b3a732f1f 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalConsumerGroup.java @@ -4,7 +4,6 @@ import java.util.Map; import java.util.Optional; import java.util.Set; -import java.util.function.Predicate; import java.util.stream.Collectors; import lombok.Builder; import lombok.Data; @@ -62,13 +61,4 @@ public static InternalConsumerGroup create( Optional.ofNullable(description.coordinator()).ifPresent(builder::coordinator); return builder.build(); } - - private InternalConsumerGroup.InternalMember filterConsumerMemberTopic( - InternalConsumerGroup.InternalMember member, Predicate<TopicPartition> partitionsFilter) { - var topicPartitions = member.getAssignment() - .stream() - .filter(partitionsFilter) - .collect(Collectors.toSet()); - return member.toBuilder().assignment(topicPartitions).build(); - } } diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalTopicConsumerGroup.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalTopicConsumerGroup.java new file mode 100644 index 00000000000..82e455ac582 --- /dev/null +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/InternalTopicConsumerGroup.java @@ -0,0 +1,61 @@ +package com.provectus.kafka.ui.model; + +import java.util.Map; +import java.util.Optional; +import javax.annotation.Nullable; +import lombok.Builder; +import lombok.Value; +import org.apache.kafka.clients.admin.ConsumerGroupDescription; 
+import org.apache.kafka.common.ConsumerGroupState; +import org.apache.kafka.common.Node; +import org.apache.kafka.common.TopicPartition; + +@Value +@Builder +public class InternalTopicConsumerGroup { + + String groupId; + int members; + @Nullable + Long messagesBehind; //null means no committed offsets found for this group + boolean isSimple; + String partitionAssignor; + ConsumerGroupState state; + @Nullable + Node coordinator; + + public static InternalTopicConsumerGroup create( + String topic, + ConsumerGroupDescription g, + Map<TopicPartition, Long> committedOffsets, + Map<TopicPartition, Long> endOffsets) { + return InternalTopicConsumerGroup.builder() + .groupId(g.groupId()) + .members( + (int) g.members().stream() + // counting only members with target topic assignment + .filter(m -> m.assignment().topicPartitions().stream().anyMatch(p -> p.topic().equals(topic))) + .count() + ) + .messagesBehind(calculateMessagesBehind(committedOffsets, endOffsets)) + .isSimple(g.isSimpleConsumerGroup()) + .partitionAssignor(g.partitionAssignor()) + .state(g.state()) + .coordinator(g.coordinator()) + .build(); + } + + @Nullable + private static Long calculateMessagesBehind(Map<TopicPartition, Long> committedOffsets, + Map<TopicPartition, Long> endOffsets) { + if (committedOffsets.isEmpty()) { + return null; + } + return committedOffsets.entrySet().stream() + .mapToLong(e -> + Optional.ofNullable(endOffsets.get(e.getKey())) + .map(o -> o - e.getValue()) + .orElse(0L) + ).sum(); + } +} diff --git a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java index 81e4e763e56..a82224ed278 100644 --- a/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java +++ b/kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java @@ -2,6 +2,7 @@ import com.provectus.kafka.ui.model.ConsumerGroupOrderingDTO; import com.provectus.kafka.ui.model.InternalConsumerGroup; +import com.provectus.kafka.ui.model.InternalTopicConsumerGroup; import com.provectus.kafka.ui.model.KafkaCluster; import com.provectus.kafka.ui.model.SortOrderDTO; import java.util.ArrayList; @@ -30,7 +31,6 @@ import reactor.util.function.Tuple2; import reactor.util.function.Tuples; - @Service @RequiredArgsConstructor public class ConsumerGroupService { @@ -71,37 +71,38 @@ public Mono<List<InternalConsumerGroup>> getAllConsumerGroups(KafkaCluster clust .flatMap(descriptions -> getConsumerGroups(ac, descriptions))); } - public Mono<List<InternalConsumerGroup>> getConsumerGroupsForTopic(KafkaCluster cluster, - String topic) { + public Mono<List<InternalTopicConsumerGroup>> getConsumerGroupsForTopic(KafkaCluster cluster, + String topic) { return adminClientService.get(cluster) // 1. getting topic's end offsets .flatMap(ac -> ac.listOffsets(topic, OffsetSpec.latest()) .flatMap(endOffsets -> { var tps = new ArrayList<>(endOffsets.keySet()); // 2. getting all consumer groups - return ac.listConsumerGroups() - .flatMap((List<String> groups) -> + return describeConsumerGroups(ac, null) + .flatMap((List<ConsumerGroupDescription> groups) -> Flux.fromIterable(groups) // 3. for each group trying to find committed offsets for topic .flatMap(g -> - ac.listConsumerGroupOffsets(g, tps) - .map(offsets -> Tuples.of(g, offsets))) - .filter(t -> !t.getT2().isEmpty()) - .collectMap(Tuple2::getT1, Tuple2::getT2) - ) - .flatMap((Map<String, Map<TopicPartition, Long>> groupOffsets) -> - // 4. 
getting description for groups with non-emtpy offsets - ac.describeConsumerGroups(groupOffsets.keySet()) - .map((Map<String, ConsumerGroupDescription> descriptions) -> - descriptions.values().stream().map(desc -> - // 5. gathering into InternalConsumerGroup - InternalConsumerGroup.create( - desc, groupOffsets.get(desc.groupId()), endOffsets) - ) - .collect(Collectors.toList()))); + ac.listConsumerGroupOffsets(g.groupId(), tps) + // 4. keeping only groups that relates to topic + .filter(offsets -> isConsumerGroupRelatesToTopic(topic, g, offsets)) + // 5. constructing results + .map(offsets -> InternalTopicConsumerGroup.create(topic, g, offsets, endOffsets)) + ).collectList()); })); } + private boolean isConsumerGroupRelatesToTopic(String topic, + ConsumerGroupDescription description, + Map<TopicPartition, Long> committedGroupOffsetsForTopic) { + boolean hasActiveMembersForTopic = description.members() + .stream() + .anyMatch(m -> m.assignment().topicPartitions().stream().anyMatch(tp -> tp.topic().equals(topic))); + boolean hasCommittedOffsets = !committedGroupOffsetsForTopic.isEmpty(); + return hasActiveMembersForTopic || hasCommittedOffsets; + } + @Value public static class ConsumerGroupsPage { List<InternalConsumerGroup> consumerGroups; diff --git a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml index 0ac9e6c46e8..e06c4537811 100644 --- a/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml +++ b/kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml @@ -2329,6 +2329,7 @@ components: messagesBehind: type: integer format: int64 + description: null if consumer group has no offsets committed required: - groupId @@ -2542,6 +2543,7 @@ components: messagesBehind: type: integer format: int64 + description: null if consumer group has no offsets committed consumerId: type: string host:
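The gold patch above computes a topic-level `messagesBehind` figure: `null` when the group has committed no offsets for the topic, otherwise the sum of per-partition lag (end offset minus committed offset), counting partitions with no known end offset as zero. A minimal TypeScript sketch of that same arithmetic, detached from the Kafka admin client — the `OffsetMap` shape and the function name are illustrative, not taken from the patch:

```typescript
// Offsets keyed by "topic-partition", e.g. "orders-0".
type OffsetMap = Record<string, number>;

// null means the group has no committed offsets for this topic; otherwise
// sum (end - committed) per partition, treating an unknown end offset as 0,
// mirroring calculateMessagesBehind in the Java patch.
function messagesBehind(
  committed: OffsetMap,
  endOffsets: OffsetMap
): number | null {
  const partitions = Object.keys(committed);
  if (partitions.length === 0) return null;
  return partitions.reduce((lag, tp) => {
    const end = endOffsets[tp];
    return lag + (end !== undefined ? end - committed[tp] : 0);
  }, 0);
}

console.log(messagesBehind({ 'orders-0': 5, 'orders-1': 7 },
                           { 'orders-0': 10, 'orders-1': 7 })); // 5
console.log(messagesBehind({}, { 'orders-0': 10 }));            // null
```

Returning `null` rather than `0` lets the API distinguish "no committed offsets" from "fully caught up", which is why the swagger change in the patch documents the field as nullable.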
null
val
train
2022-09-29T13:34:48
"2022-05-31T13:58:09Z"
armenuikafka
train
provectus/kafka-ui/2531_2629
provectus/kafka-ui
provectus/kafka-ui/2531
provectus/kafka-ui/2629
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=0.9999999999999998)", "connected" ]
3733729a55123781d7e5ca5ea94be582602be93a
f31b965d664bdf3d76ac1fb37f59d174b8feb7bb
[ "`/api/clusters/<cluster>/topics/<topicName>/consumer-groups`\r\ntopicName should be an exact one" ]
[ "You can use search without extra state params. Just check how Search component works with searchParams" ]
"2022-09-22T14:38:49Z"
[ "type/enhancement", "good first issue", "scope/frontend", "status/accepted" ]
Implement Search within Consumer group profile
**Describe the bug** Please implement Search by Topic Name within the Consumer group profile **Set up** https://www.kafka-ui.provectus.io/ **Steps to Reproduce** 1. Navigate to the Consumer group profile **Expected behavior** Search by Topic Name should be available **Screenshots** <img width="1326" alt="search within consumer" src="https://user-images.githubusercontent.com/104780608/188561560-c59178b9-33c2-42da-ac4a-27bf87e5d882.png"> **Additional context** <!-- (Add any other context about the problem here) -->
[ "kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx", "kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx" ]
[ "kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx", "kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx b/kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx index f1ce947fdb7..d45685d7638 100644 --- a/kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx +++ b/kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx @@ -1,11 +1,12 @@ import React from 'react'; -import { useNavigate } from 'react-router-dom'; +import { useNavigate, useSearchParams } from 'react-router-dom'; import useAppParams from 'lib/hooks/useAppParams'; import { clusterConsumerGroupResetRelativePath, clusterConsumerGroupsPath, ClusterGroupParam, } from 'lib/paths'; +import Search from 'components/common/Search/Search'; import PageLoader from 'components/common/PageLoader/PageLoader'; import ClusterContext from 'components/contexts/ClusterContext'; import PageHeading from 'components/common/PageHeading/PageHeading'; @@ -24,11 +25,14 @@ import { } from 'redux/reducers/consumerGroups/consumerGroupsSlice'; import getTagColor from 'components/common/Tag/getTagColor'; import { Dropdown, DropdownItem } from 'components/common/Dropdown'; +import { ControlPanelWrapper } from 'components/common/ControlPanel/ControlPanel.styled'; import ListItem from './ListItem'; const Details: React.FC = () => { const navigate = useNavigate(); + const [searchParams] = useSearchParams(); + const searchValue = searchParams.get('q') || ''; const { isReadOnly } = React.useContext(ClusterContext); const { consumerGroupID, clusterName } = useAppParams<ClusterGroupParam>(); const dispatch = useAppDispatch(); @@ -62,6 +66,14 @@ const Details: React.FC = () => { const partitionsByTopic = groupBy(consumerGroup.partitions, 'topic'); + const filteredPartitionsByTopic = Object.keys(partitionsByTopic).filter( + (el) => el.includes(searchValue) + ); + + const currentPartitionsByTopic = searchValue.length + ? filteredPartitionsByTopic + : Object.keys(partitionsByTopic); + return ( <div> <div> @@ -108,6 +120,9 @@ const Details: React.FC = () => { </Metrics.Indicator> </Metrics.Section> </Metrics.Wrapper> + <ControlPanelWrapper hasInput style={{ margin: '16px 0 20px' }}> + <Search placeholder="Search by Topic Name" /> + </ControlPanelWrapper> <Table isFullwidth> <thead> <tr> @@ -116,7 +131,7 @@ const Details: React.FC = () => { </tr> </thead> <tbody> - {Object.keys(partitionsByTopic).map((key) => ( + {currentPartitionsByTopic.map((key) => ( <ListItem clusterName={clusterName} consumers={partitionsByTopic[key]} diff --git a/kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx b/kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx index 7a99ee07138..e16eae8669b 100644 --- a/kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx +++ b/kafka-ui-react-app/src/components/ConsumerGroups/Details/__tests__/Details.spec.tsx @@ -77,6 +77,13 @@ describe('Details component', () => { ); }); + it('renders search input', async () => { + await renderComponent(); + expect( + screen.getByPlaceholderText('Search by Topic Name') + ).toBeInTheDocument(); + }); + it('shows confirmation modal on consumer group delete', async () => { expect(screen.queryByRole('dialog')).not.toBeInTheDocument(); userEvent.click(screen.getByText('Delete consumer group'));
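The patch reads the `q` search param and keeps only the topic names that contain the query, falling back to all topics when the query is empty. The same filter as a standalone function — `filterTopicNames` is a hypothetical name, not an export of the PR:

```typescript
// Empty query keeps every topic; otherwise keep topics whose name
// contains the query as a substring, as the patch does.
function filterTopicNames(topicNames: string[], query: string): string[] {
  if (query.length === 0) return topicNames;
  return topicNames.filter((name) => name.includes(query));
}

const topics = ['orders', 'payments', 'order-events'];
console.log(filterTopicNames(topics, 'order')); // ['orders', 'order-events']
console.log(filterTopicNames(topics, ''));      // all three names
```

Note that `String.prototype.includes` is case-sensitive, so searching for "Orders" would miss "orders"; a case-insensitive variant would lower-case both sides before comparing.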
null
train
train
2022-09-27T11:03:56
"2022-09-06T06:28:24Z"
armenuikafka
train
provectus/kafka-ui/2332_2637
provectus/kafka-ui
provectus/kafka-ui/2332
provectus/kafka-ui/2637
[ "keyword_pr_to_issue", "timestamp(timedelta=1.0, similarity=1.0)" ]
67eea972f77c1e133e9e7360740189a0fa3373f2
80eb2dccfea9b15b64e927493fbbc5158549c935
[ "Tabs are still on top for topics:\r\n![image](https://user-images.githubusercontent.com/112083452/192267959-a6ef8be9-7886-4d1f-b3c5-bd2a17a08b67.png)\r\n\r\nNeeds a fix.", "@Haarolean considering the comment from Oleg in #2637 , can this considered to WAD for the topics and therefore closed?" ]
[]
"2022-09-23T12:29:19Z"
[ "good first issue", "scope/frontend", "type/refactoring", "status/accepted", "hacktoberfest" ]
Unify tabs placement
e.g. the connector overview page: <img width="831" alt="image" src="https://user-images.githubusercontent.com/1494347/180577153-3eaa5d32-9d3a-4c99-b470-81db2759a803.png"> Tabs are placed above the panes, and the page looks empty and weird. Let's unify **all the views** with this approach: <img width="671" alt="image" src="https://user-images.githubusercontent.com/1494347/180577197-385faf9c-1500-4fb0-836a-52d87191eed8.png"> On the broker overview page, panes are located below the tabs, therefore the "overview" tab is no longer required.
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java", "kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx", "kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx" ]
[ "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java", "kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java", "kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx", "kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx" ]
[]
diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java index a63a1f47c16..82768781d4d 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/connector/ConnectorDetails.java @@ -22,7 +22,6 @@ public class ConnectorDetails { public ConnectorDetails waitUntilScreenReady() { $(By.xpath("//a[text() ='Tasks']")).shouldBe(Condition.visible); $(By.xpath("//a[text() ='Config']")).shouldBe(Condition.visible); - $(By.xpath("//a[text() ='Overview']")).shouldBe(Condition.visible); return this; } diff --git a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java index fcc19f31e2d..52c3ff9b908 100644 --- a/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java +++ b/kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/utilities/qaseIoUtils/QaseExtension.java @@ -37,7 +37,6 @@ public class QaseExtension implements TestExecutionListener { private static final String QASE_PROJECT = "KAFKAUI"; private static final String QASE_ENABLE = "true"; - static { String qaseApiToken = System.getProperty("QASEIO_API_TOKEN"); if (qaseApiToken == null || StringUtils.isEmpty(qaseApiToken)) { diff --git a/kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx b/kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx index 2ec9632b5e0..a5175ea8bfa 100644 --- a/kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx +++ b/kafka-ui-react-app/src/components/Connect/Details/DetailsPage.tsx @@ -5,8 +5,6 @@ import { clusterConnectConnectorConfigPath, clusterConnectConnectorConfigRelativePath, clusterConnectConnectorPath, - clusterConnectConnectorTasksPath, - clusterConnectConnectorTasksRelativePath, clusterConnectorsPath, RouterParamsClusterConnectConnector, } from 'lib/paths'; @@ -32,6 +30,7 @@ const DetailsPage: React.FC = () => { > <Actions /> </PageHeading> + <Overview /> <Navbar role="navigation"> <NavLink to={clusterConnectConnectorPath( @@ -41,16 +40,6 @@ const DetailsPage: React.FC = () => { )} className={({ isActive }) => (isActive ? 'is-active' : '')} end - > - Overview - </NavLink> - <NavLink - to={clusterConnectConnectorTasksPath( - clusterName, - connectName, - connectorName - )} - className={({ isActive }) => (isActive ? 
'is-active' : '')} > Tasks </NavLink> @@ -67,11 +56,7 @@ const DetailsPage: React.FC = () => { </Navbar> <Suspense fallback={<PageLoader />}> <Routes> - <Route index element={<Overview />} /> - <Route - path={clusterConnectConnectorTasksRelativePath} - element={<Tasks />} - /> + <Route index element={<Tasks />} /> <Route path={clusterConnectConnectorConfigRelativePath} element={<Config />} diff --git a/kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx b/kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx index 8971111d4ba..46aa1037bef 100644 --- a/kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx +++ b/kafka-ui-react-app/src/components/Connect/Details/__tests__/DetailsPage.spec.tsx @@ -3,14 +3,13 @@ import { render, WithRoute } from 'lib/testHelpers'; import { clusterConnectConnectorConfigPath, clusterConnectConnectorPath, - clusterConnectConnectorTasksPath, getNonExactPath, } from 'lib/paths'; import { screen } from '@testing-library/dom'; import DetailsPage from 'components/Connect/Details/DetailsPage'; const DetailsCompText = { - overview: 'Overview Page', + overview: 'Overview Pane', tasks: 'Tasks Page', config: 'Config Page', actions: 'Actions', @@ -55,19 +54,14 @@ describe('Details Page', () => { expect(screen.getByText(DetailsCompText.actions)); }); - describe('Router component tests', () => { - it('should test if overview is rendering', () => { - renderComponent(); - expect(screen.getByText(DetailsCompText.overview)); - }); + it('renders overview pane', () => { + renderComponent(); + expect(screen.getByText(DetailsCompText.overview)); + }); + describe('Router component tests', () => { it('should test if tasks is rendering', () => { - const path = clusterConnectConnectorTasksPath( - clusterName, - connectName, - connectorName - ); - renderComponent(path); + renderComponent(); expect(screen.getByText(DetailsCompText.tasks)); });
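Structurally, the patch renders `<Overview />` unconditionally above the tab bar and promotes Tasks to the index child route, so the dedicated Overview tab disappears. A stripped-down React Router v6 sketch of that shape — the pane bodies are placeholders rather than the real components, and it assumes the page is mounted under a parent route ending in `/*`, as in the app:

```tsx
import React from 'react';
import { Routes, Route, NavLink } from 'react-router-dom';

const Overview = () => <section>overview metrics</section>;
const Tasks = () => <section>tasks table</section>;
const Config = () => <section>config editor</section>;

// Overview is always visible above the tabs, so it needs no tab of its
// own; Tasks is the index route and renders by default.
const DetailsPage: React.FC = () => (
  <div>
    <Overview />
    <nav>
      <NavLink to="" end>Tasks</NavLink>
      <NavLink to="config">Config</NavLink>
    </nav>
    <Routes>
      <Route index element={<Tasks />} />
      <Route path="config" element={<Config />} />
    </Routes>
  </div>
);

export default DetailsPage;
```

The matching test change swaps the routed "Overview Page" for an always-present "Overview Pane", mirroring this move from a route to a static pane.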
null
val
train
2022-10-07T14:01:58
"2022-07-22T22:38:51Z"
Haarolean
train
provectus/kafka-ui/2615_2638
provectus/kafka-ui
provectus/kafka-ui/2615
provectus/kafka-ui/2638
[ "timestamp(timedelta=44181.0, similarity=1.0)", "connected" ]
6e8ce77fd302d0b24e5c01a7959c8592e2abe65c
b940c28b5c4af3ce99d68c48ec46082463f79113
[ "Hello there BulatKha! πŸ‘‹\n\nThank you and congratulations πŸŽ‰ for opening your very first issue in this project! πŸ’–\n\nIn case you want to claim this issue, please comment down below! We will try to get back to you as soon as we can. πŸ‘€", "I'd lik to take this one." ]
[]
"2022-09-25T13:47:44Z"
[ "type/bug", "scope/frontend", "status/accepted", "status/confirmed" ]
Schema registry: Previous version's schema preview is too narrow
The preview of the previous schema version is a lot narrower than the window size. Checked in: - Safari 15.6.1 (17613.3.9.1.16) - Chrome 105.0.5195.125 (Official Build) (x86_64) - Brave 1.43.88 Chromium: 105.0.5195.68 (Official Build) (x86_64) **Set up** App version: e621a172d5d9a065dd1d6064099bac3aadc32549 ( [e621a17](https://github.com/provectus/kafka-ui/commit/e621a17) ) URL: http://master.internal.kafka-ui.provectus.io/ui/clusters/local/schemas/avro2 **Steps to Reproduce** 1. Navigate to "Schema registry". 2. Open one of the schemas. 3. Click on the plus button in the "Old versions" section. **Actual behavior** The preview looks very narrow and barely usable. ![image](https://user-images.githubusercontent.com/112083452/191220995-3eb0c48b-cdd9-4d48-a127-8269050ea27e.png) **Expected behavior** The schema preview should match the "Old versions" container width. Setting a minimum width for the preview might be a plus. **Additional context**
[ "kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx" ]
[ "kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx" ]
[]
diff --git a/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx b/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx index 1a729a6eaec..472a6e544fd 100644 --- a/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx +++ b/kafka-ui-react-app/src/components/Schemas/Details/SchemaVersion/SchemaVersion.tsx @@ -30,7 +30,7 @@ const SchemaVersion: React.FC<SchemaVersionProps> = ({ </tr> {isOpen && ( <S.Wrapper> - <td colSpan={3}> + <td colSpan={4}> <EditorViewer data={schema} schemaType={schemaType} /> </td> </S.Wrapper>
null
train
val
2022-09-22T14:39:01
"2022-09-20T09:54:49Z"
BulatKha
train